Understanding Decisions

We make a lot of decisions each day. Most are easy to make, even when they matter more than we realize. Others are small, even forgettable. And then there are decisions that are genuinely hard to make, and those often carry more significant outcomes.

But difficulty and consequence of a decision don’t stand in direct relation. A trivial decision can unfold into something unexpected, just as a decision that took real effort may lead to almost no change in direction. The weight of an outcome doesn’t always match the effort it took to choose.

And making a decision from the same set of choices is a largely unique process for each individual. Given the same options, one person might pick the best choice without any thought, another might deliberate hard between two or more options that all look best to them, and a third might struggle to choose because every option looks bad to them.

Given how complicated the act of making a decision is, I have always wanted to understand more about it and, at the least, answer the following questions.

  1. What counts as a decision? No, but really — most definitions say something like “A conclusion reached after consideration.” Is consideration a must for making a decision? Can computers make decisions?
  2. What is our internal process of making a decision?
  3. How many decisions do we make every day?
  4. What makes a decision easy or hard to make? What kind of context actually shapes this difficulty?
  5. How many of the easy decisions actually needed more thought, and how many of the hard decisions wouldn’t really matter?
  6. Why are some people better at making decisions while others struggle with it?
  7. How do you learn to make better decisions? Do you simply get better at it when you get more life experience?
  8. How do decisions interact with our identity and principles?
  9. How do we evaluate decisions after the fact?

While I have more questions that remain mostly unanswered, I am reasonably happy with the answers I found to the ones above over many years of thinking, observing, and learning. I answer them at the end of this piece.

This piece reflects about three years of intermittent thinking and reading on the topic. Any feedback or thoughts are welcome.

Choices

At every moment, life presents us with choices. We decide what to eat, which route to take, whether to speak or stay silent, whether to call our parents tonight. Some of these choices feel trivial and automatic; others are slow, effortful, and emotionally charged. That contrast is what naturally raises deeper questions: are all of these really “decisions” in the same sense? At what point does a reaction become a decision rather than a habit or a reflex? And what is actually happening in the mind when we decide?

Decision-making is fundamentally the cognitive process of selecting among alternatives. Cognitive science usually treats a decision as requiring at least two options and some kind of evaluation, even if that evaluation happens quickly and without much effort. But how much of this process must be conscious? Current evidence suggests that the answer depends on the situation: routine choices can be carried out largely by automatic, intuitive mechanisms, while more complex or uncertain choices recruit slower, deliberate reasoning.

When Does Choice Become Decision?

Benjamin Libet’s famous 1983 experiments revealed that brain activity linked to voluntary movements begins about 550 milliseconds before we become consciously aware of deciding to act. In other words, the brain’s “readiness potential” appears more than half a second before the reported moment of choice. This finding sparked decades of debate about free will and consciousness, but the contemporary interpretation is more nuanced. Most researchers now agree that unconscious processes strongly shape our behavior, yet they also distinguish between simple, automatic responses and genuinely deliberative decisions that involve conscious evaluation of alternatives and consequences.

Daniel Kahneman’s dual-process model makes this distinction clearer. System 1 is fast, automatic, and intuitive; it manages countless routine judgments and choices without conscious effort. When you recognize an angry face or understand a simple sentence, System 1 is at work. System 2, by contrast, is slow, deliberate, and analytical. It engages when a task is too difficult for System 1, or when our intuitions clash with our goals. Complex calculations, careful reasoning, and unfamiliar situations all require System 2’s focused, effortful processing.

Most everyday “decisions” fall along a continuum between these two modes. We move through this space with the help of what neuroscientist Antonio Damasio calls somatic markers: bodily feelings that tag possible outcomes as good or bad in light of past experience. When we weigh a choice, we do not only think; we also feel. The ventromedial prefrontal cortex reactivates emotional states linked to similar past situations, while the amygdala delivers rapid assessments of threat and reward. Patients with damage to these regions often make disastrously poor real-world decisions despite having intact IQ, showing that emotion is not the enemy of good judgment but a crucial part of it.

Can computers genuinely make decisions? Today we even have computer-science subfields called decision-making, but the question still goes to the core of what we mean by deciding. AI systems select among options using algorithms and pattern recognition, often with superhuman speed and reliability. The Chinese room is no longer just a thought experiment, since LLMs can speak both Chinese and English. But we already know that the LLMs we use today don’t have the consciousness, real understanding, or subjective experience that shape human choices. Today’s AI systems simply manipulate symbols without understanding their meaning or having any genuine intentions. What they do is better described as optimization than deciding, though the functional similarities grow more compelling as AI systems become more sophisticated.

The Hidden Machinery of Human Choice

Inside the brain, decisions arise from the coordinated activity of several interacting systems.

The orbitofrontal cortex represents the utility value of specific options and maintains links between cues, actions, and their likely outcomes; in practice, it helps you learn that “this kind of choice usually leads to that kind of result,” and it updates those expectations when your internal state changes: for example, when food looks less appealing after you’re already full, or when social approval suddenly matters more than saving time.

The anterior cingulate cortex integrates different decision variables — such as reward size, probability of success, required effort, and potential risk — into a roughly common scale. This allows you to compare very different options (like “work late for a promotion” versus “go out with friends now”) in a way that feels like a single choice rather than a set of unrelated trade-offs.

The lateral prefrontal cortex maintains relevant information in working memory and directs attention according to your current goals. It is what lets you keep the rules of a task in mind, resist distractions, and follow through on a chosen plan instead of drifting toward whatever is most immediately tempting.

When we face a choice, these systems engage in a dynamic process. Neural activity related to different options builds up over time, with groups of neurons gradually increasing their firing until one option crosses a decision threshold and is selected. The brain evaluates options through both parallel processing (assessing multiple features simultaneously) and serial processing (comparing alternatives sequentially). This process is not purely computational or detached. The amygdala and insula generate somatic states that bias our selections before we’re consciously aware of them, while the prefrontal regions can override these intuitive responses when necessary.
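The accumulation-to-threshold dynamic described above can be illustrated with a toy “race” model in code. This is a sketch for intuition only: the function name, parameters, and numbers are my own, a simplification of the formal drift-diffusion models used in the literature, not a neural simulation.

```python
import random

def decide(values, threshold=1.0, noise=0.3, dt=0.01, rng=random):
    """Toy race model: each option accumulates noisy evidence at a rate
    proportional to its subjective value; the first accumulator to cross
    the threshold is selected. Returns (chosen index, time steps taken)."""
    acc = [0.0] * len(values)
    steps = 0
    while True:
        steps += 1
        for i, v in enumerate(values):
            # drift toward the threshold plus Gaussian noise
            acc[i] += v * dt + rng.gauss(0.0, noise) * dt ** 0.5
            if acc[i] >= threshold:
                return i, steps

rng = random.Random(42)
# an "easy" choice: one option is clearly more valuable
easy = [decide([1.0, 0.2], rng=rng) for _ in range(100)]
# a "hard" choice: near-tied values take longer to resolve
hard = [decide([0.6, 0.5], rng=rng) for _ in range(100)]
```

In this sketch the clearly better option wins nearly every easy trial quickly, while closely matched values produce slower, noisier races, which mirrors why near-ties feel subjectively hard.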

Thus, the actual decision-making process unfolds in stages: representing available alternatives and internal states, evaluating external variables like rewards and risks, computing action values that account for effort and delay, comparing options and selecting one, then evaluating the outcome to generate prediction errors that inform future choices. This multi-stage architecture enables flexible, adaptive decision-making ranging from split-second intuitions to careful deliberation, with emotions and reasoning working together rather than in opposition.

Counting Decisions

How many decisions do we make daily? The widely cited figure of 35,000 decisions per day lacks scientific foundation, and there is no peer-reviewed work measuring this number. There is a popular study claiming people make at least 200 daily food-related decisions, but its counting methods have been challenged.

It’s likely reasonable to say that we make hundreds of choices every day. In my personal experiment of counting deliberate decisions over a few days, I counted around 100-250 conscious decisions daily. The self-experiment aspect biases the numbers (paying attention to decisions creates more of them), but I expect them to be roughly accurate.

Self-experiment logs

It is interesting that many of the basic decisions vary in number based on the environment and randomness. For example, when I wake up, I decide whether to check my phone. During my morning shower, I tend to make 4-6 decisions about planning my day. I also make 2-3 decisions reflecting on past events -- these are in the style of, "I should have done X rather than Y." After my shower, I usually decide whether it's worth checking the weather based on how the outside looks from the window. Then I decide what shirt or sweater to wear. I have exactly one pair of pants and shoes, so I don't decide on them. Then I decide whether to try to catch the bus or take a walk. During my commute, I make a few work-related decisions. By the time I arrive at work, around 9-10AM, I have already made 15-20 decisions. During work, most of my decisions are about the work itself, and some are about lunch, snacks, and drinks. I keep a bottle of water on my desk, and I realized that I decide whether to drink from it about every half hour. There's also a decision to go to the bathroom every hour, even if I don't need to use it (then it's just a walk). By the end of the workday, I have made anywhere between 80 and 150 decisions. After work, there are a few food-related decisions -- where to eat or whether to cook at home, what to eat, how many calories I have left in today's deficit, and so on. I also noticed that I tend to make a decision about doing something or not even when my plans already commit me to it; these are intrusive thoughts forcing a decision, though sometimes they are helpful. I make a few more decisions about what to do with my evening, and a few more reflection decisions before sleeping.

While I was working, life was reasonably boring (in a good way) and I didn’t get to make many “bad decisions.” However, if I ran the same experiment on one of my school days, I’d likely count more decisions, with more of them falling in the “bad” zone.

Difficulty of Decisions

What makes some decisions easy while others feel agonizing? Cognitive load plays the main role here. When working memory is heavily taxed, decision quality drops: we lean more on heuristics, show stronger biases, and struggle with complex trade-offs. Stakes matter as well. High-stakes choices trigger stronger emotional responses and stress, which can either sharpen focus or impair judgment. Uncertainty — whether from ambiguous evidence, unknown probabilities, or complex interactions — makes decisions much harder. Time pressure pushes us toward simpler strategies; this can sometimes improve speed without much loss of accuracy, but in other cases it increases errors or distorts our preferences. There is also some controversial work suggesting that making a lot of decisions, especially under difficulty, causes decision fatigue.

The concept of decision fatigue captured public imagination through Roy Baumeister’s ego depletion research, which suggested that self-control operates like a muscle that becomes exhausted with use. The famous finding that judges grant parole 65% of the time early in the day but nearly 0% before breaks seemed to confirm that sequential decisions deplete mental resources. However, ego depletion research has become a textbook case of the replication crisis. Large-scale replications failed to find the effect, and critics now argue the phenomenon doesn’t exist as originally described.

What remains clear is that sequential demanding tasks do impair performance, but the underlying mechanism (whether it reflects ego depletion, motivational shifts, or cognitive fatigue) remains in question.

The Importance Paradox

We sometimes lavish attention on decisions that matter little and rush through choices with serious, long-term implications. This mismatch between perceived and actual importance creates one of decision-making’s main ironies. When options have roughly equal risk-reward profiles, overthinking becomes costly; philosophers call this the “Buridan’s Ass” problem, where a donkey placed exactly between food and water starves because it has no reason to choose one over the other.

Several well-known biases quietly amplify this problem. The planning fallacy shows that we consistently underestimate how long tasks will take, even when we’ve seen similar tasks overrun in the past. The sunk cost fallacy shows how past investments cloud present judgment, nudging us to keep backing losing plans simply to avoid recognizing a loss. In both cases, we make our decisions harder by giving weight to things that should be irrelevant (sunk costs) or by trusting our hopeful estimates instead of the track record and base rates that would guide us better.

Environmental factors shape choices more than we recognize. The aforementioned study of 200 food-related decisions may have counted them imprecisely, but it showed that people eating from larger bowls consume 31% more food, yet 52% deny eating more and only 4% attribute it to bowl size. Broader research supports the underlying point: subtle features of context — defaults, layout, framing, and design details — can reliably shift behavior without our awareness. That means many “easy” decisions about how we set up our surroundings are more important than they appear, because their effects accumulate over time.

By contrast, many choices that feel momentous are less decisive than we imagine. For many everyday decisions where several options are genuinely good, their long-term impact on our happiness tends to be less dramatic than we predict; we adapt to the path we take and usually find ways to feel reasonably satisfied with it. Additionally, well-known maximizer-vs-satisficer research shows that while maximizers search for and often actually find a better opportunity, they report lower happiness with their decisions and life overall; satisficers, on the other hand, tend to be more satisfied and less prone to regret, even when their outcomes are not objectively superior. The act of drawing a line (“this is good enough”) and then moving on appears protective against endless comparison and what-if thinking.

Why Some People Are Better Decision-Makers

Individual differences in decision-making are substantial and predictable. General intelligence consistently emerges as the strongest cognitive predictor of decision competence, followed closely by working memory capacity and numeracy. These abilities enable people to process complex information, consider multiple options simultaneously, and calculate trade-offs accurately.

A more recent study found that conscientiousness, openness, and emotional stability are also key factors in decision-making. Conscientiousness, a personality trait characterized by goal-directed planning and self-discipline, is strongly correlated with the cognitive factors. Openness to experience promotes consideration of diverse perspectives and information seeking, which leads to more well-rounded decisions. Emotional stability helps keep irrational emotional biases from swaying the choice.

To understand how expert decision-makers work, we can study experts doing high-stakes work. Through his Recognition-Primed Decision model, Gary Klein showed that firefighters, nurses, and military officers don’t compare options analytically; they recognize patterns from vast experience and simulate outcomes mentally. Expert intuition relies on pattern recognition developed through thousands of hours in domains with regular patterns and accurate feedback.

However, expertise has strict limits. The landmark 2009 agreement between Klein and Kahneman identified conditions necessary for valid intuitive expertise: environments must be sufficiently regular and predictable to allow pattern learning, with opportunities for prolonged practice and accurate feedback. This tension between reliable expert intuition in high-validity domains and systematic error in low-validity ones also lies at the heart of controversy over Thinking, Fast and Slow, where critics argue that Kahneman sometimes overgeneralizes from biased judgment in noisy environments to intuition more broadly. Stock picking, clinical psychological diagnosis, and long-term political forecasting often lack the required conditions, so experience in these domains breeds overconfidence without much genuine skill. This helps explain why some clinical psychologists perform no better than students at diagnosis despite years of experience, while chess experts genuinely see further.

Age effects on decision-making are complex and non-linear. Cognitive mechanics such as processing speed and working memory decline from early adulthood, which can reduce decision quality when tasks are demanding. But crystallized intelligence, accumulated knowledge and experience, can offset these declines in familiar domains. Older adults also tend to allocate cognitive resources strategically to personally relevant decisions, performing well where it matters while using simpler strategies elsewhere. The wisdom of age is real but domain-specific and context-dependent.

Getting Better at Choosing

Does decision-making simply improve with life experience? No. Years of practice without accurate feedback can entrench biases rather than eliminate them. A popular study by Camerer and Johnson found that clinical psychologists with years of experience performed no better than students at psychological diagnosis. Experience in unpredictable domains creates overconfidence without skill.

Improvement requires deliberate practice with specific characteristics: well-defined tasks, immediate accurate feedback, opportunities for repetition and error correction, focused attention on weaknesses, and progressive challenge. This typically requires 10+ years for expert-level performance, but the key is quality of practice, not just quantity. As Klein notes, “most people gain expertise on their own, not through training programs,” but the practice must occur in domains where expertise is possible. Regular patterns and clear feedback are required for improvement.

Training can accelerate this process. Experiments with games and other structured interventions show lasting reductions in common cognitive biases and gains in decision competence, especially when training is interactive, gives concrete feedback, and combines multiple methods. Tools such as decision journals and premortems strengthen metacognitive skills by separating the quality of reasoning from the eventual outcome.

Personally, I have lately been playing a lot of roguelike games (Brotato, Risk of Rain 2, Vampire Survivors etc.) where I have to make decisions that might affect my run significantly. I have noticed myself improving a lot more after watching how an expert plays the game and what kind of decision-making they do. Interestingly, most experts in these games are able to play well even when making (sometimes deliberate, for fun) bad decisions or mistakes. What seems to matter more is being good at choosing the right thing most of the time. Perhaps we can take a lesson from these games and realize that we don’t have to be flawless to be on a winning track.

Who We Are Shapes Our Decisions

The decisions we make express our principles and identity. The Identity-Value Model proposes that behaviors tied to our sense of self carry greater subjective value than those that are identity-neutral, and are therefore more likely to be chosen. At the neural level, the ventromedial prefrontal cortex appears to support both identity-related processing and value computation, suggesting a deep integration between who we are and what we find worthwhile.

Our self-concept is not fixed; different aspects of identity become salient in different contexts. The same person may see themselves as an environmentalist, a student, a professional, and an athlete, but which of these identities guides behavior depends on the situation. Identities that are currently active exert more influence on decisions than those in the background. People with high self-concept clarity tend to make better decisions, especially in social settings, because they can more accurately recognize which options serve their own goals and the goals of others.

Values act as broad, cross-situational principles that shape what we regard as desirable outcomes. Cross-cultural research identifies ten basic human values arranged in a circular structure. Self-direction and stimulation (openness to change) stand in tension with security, conformity, and tradition (conservation). Power and achievement (self-enhancement) stand in tension with universalism and benevolence (self-transcendence). When we decide, we implicitly or explicitly assess options in terms of their fit with our core values. Values exert the strongest influence when they are important to us, relevant to the context, and consciously activated.

Moral Foundations Theory further shows that moral judgment grows out of six evolved psychological systems: care/harm, equality, proportionality, loyalty/betrayal, authority/subversion, and sanctity/degradation. These systems shape choices through fast, intuitive responses rather than slow, deliberate reasoning. Often, moral reasoning comes afterward, serving mainly to justify our initial intuitions. Political disagreements about morality can be understood as differences in how heavily people rely on each foundation: for example, liberals tend to emphasize care and equality, whereas conservatives draw more evenly on all six.

Taken together, identity, values, and moral intuitions mean that decisions are never purely “objective.” Career moves, relationship choices, ethical judgments, and even consumer behavior are all channels through which we maintain our identities and express our values, not just ways of maximizing utility. This is not a flaw in human decision-making. It is what allows us to be consistent over time, coordinate with others, and build lives that feel meaningful.

The Distorted Mirror of Hindsight

How do we evaluate decisions after the fact? Turns out, we aren’t very good at that. Our evaluations are systematically distorted by biases that creep in once outcomes are known.

One major culprit is hindsight bias — the “knew it all along” effect. After we know how something turned out, we tend to see that outcome as having been more predictable than it really was. Baruch Fischhoff’s classic studies showed that people who are told an outcome later rate it as much more likely than people who are predicting it in advance. Once we know what happened, that knowledge seeps into our memory and reshapes it. We can no longer accurately reconstruct how uncertain we actually felt beforehand. This effect is not trivial: meta-analytic estimates put its size in the moderate range (d ~ 0.40–0.60), and it persists even when people are explicitly warned about it.

Outcome bias makes matters worse. We judge the quality of a decision by its result, even when we know the result was largely due to chance and even when we explicitly agree that we should focus on the decision process. In experiments by Baron and Hershey, people rated the same decision as better when it led to a good outcome than when it led to a bad one. Positive outcomes draw our attention to arguments that support the decision; negative outcomes highlight reasons against it.

Together, these biases create a core problem for fair evaluation. In an uncertain world, good decisions can lead to bad outcomes, and bad decisions can occasionally pay off. Hindsight blurs that distinction: a good decision with a bad outcome looks obviously foolish, and a reckless decision with a lucky result seems retrospectively wise. Surgeons are judged more harshly when appropriate procedures end badly. Coaches are praised or condemned based on the final score rather than the soundness of their strategy. Investors are evaluated by returns rather than by the quality of their reasoning.

Counterfactual thinking, imagining “what might have been”, adds another layer. It can be useful when it takes the form of upward, action-focused counterfactuals (“If I had done X instead of Y, it would have gone better, so I can try X next time”). Around three-quarters of spontaneous counterfactuals focus on our own actions, and when these thoughts translate into specific intentions, they can improve later performance. But counterfactuals also fuel regret, a potent negative emotion that affects both how bad we feel and how we choose in the future, often leading to regret-averse decisions.

The time course of regret is also interesting. In the short term, we tend to regret actions, the things we did that went wrong. Over longer time horizons, we regret inactions more; for example, paths not taken, degrees not pursued, relationships never started. These missed opportunities loom larger than most active mistakes. That pattern suggests that while bad decisions matter, chronic hesitation and avoided choices may be even more costly.

The key to better evaluation is to separate process quality from outcome quality. Process quality concerns what was under our control at the time: how well we gathered information, considered alternatives, reasoned about trade-offs, and aligned choices with our values and goals. Outcome quality reflects what actually happened, which is always influenced by luck, other people’s behavior, and unforeseeable events. As Baron and Hershey stressed, even the best decision cannot guarantee a good outcome; all real decisions are made under uncertainty.

Good practice therefore includes documenting reasoning and alternatives before outcomes are known, conducting decision reviews in advance of results, adopting probabilistic thinking that accepts that good decisions sometimes fail, and evaluating patterns across many decisions rather than single cases. Organizations that reward only favorable outcomes risk punishing thoughtful risk-taking and rewarding lucky recklessness. Focusing on the quality of decision processes instead enables more honest learning from both success and failure.

Personally, I have indeed seen great success with projects and companies that had a well-structured post-mortem process and focused on robust processes with both tactical and strategic solutions.

Conclusion: Nine Insights on Deciding

Based on all of the above, we can finally answer my questions and distill nine insights about decisions. The responses are naturally simplified and serve as a TL;DR for the above.

  1. What counts as a decision? Decision-making is the cognitive process of selecting among alternatives; a decision is the option selected. While simple reactive behaviors can be unconscious, genuine decisions typically involve some conscious consideration, especially for complex, multi-attribute choices (at least, in the current literature). Current AI systems make functional selections but lack the intentionality, consciousness, and genuine understanding that characterize human decisions.

  2. What is our internal process? Decisions emerge from coordinated activity across brain regions: the orbitofrontal cortex encodes values, the anterior cingulate integrates variables, and the prefrontal cortex maintains working memory and implements control. The process unfolds through representing alternatives, evaluating variables, computing action values, comparing options, and evaluating outcomes. System 1 provides fast intuitions while System 2 engages deliberative reasoning, with emotions providing essential somatic markers that bias selections based on experience.

  3. How many decisions do we make daily? While we have a myth of 35,000 daily decisions, we likely make only hundreds of decisions a day. The exact number matters less than recognizing we vastly underestimate our choice frequency, particularly for routine behaviors.

  4. What makes decisions easy or hard? Cognitive load, stakes, uncertainty, time pressure, and decision complexity all increase difficulty. Context shapes difficulty through physiological state, information availability, social factors, and individual differences in cognitive abilities and expertise.

  5. How do easy and hard decisions mismatch with importance? We overthink decisions with similar alternatives while breezing through choices with compound effects. Environmental factors we barely notice often matter more than major decisions we agonize over. Many hard decisions matter less than assumed as we are good at adapting, while many easy decisions deserve more thought as small choices compound over time.

  6. Why are some people better decision-makers? Superior decision-making reflects cognitive abilities (intelligence, working memory, numeracy), personality traits (conscientiousness, openness), and expertise developed through thousands of hours in domains with regular patterns and accurate feedback. Individual differences are substantial and stable over time.

  7. How do we learn to make better decisions? Experience alone is insufficient; studies show that experienced experts sometimes perform no better than students. Overconfidence hurts good choices. Improvement requires deliberate practice in domains with accurate feedback. Formal training can accelerate development through debiasing interventions, interactive practice, feedback, and metacognitive skill development.

  8. How do decisions interact with identity and principles? Identity-relevant behaviors hold greater subjective value, making them more likely to be enacted. Values provide trans-situational guiding principles, while moral intuitions from evolved foundations shape rapid evaluations. Decisions both express and constitute identity. They’re how we maintain self-concept and coordinate socially.

  9. How do we evaluate decisions afterward? Poorly, due to hindsight bias (outcomes seem more predictable in retrospect), outcome bias (results over-weighted relative to process), and counterfactual thinking that fuels regret. Good evaluation requires separating process quality from outcome quality, recognizing that uncertainty makes good decisions compatible with bad outcomes, and learning from procedures rather than just results.

Understanding these points does not make decisions easy or remove uncertainty. It does, however, clarify what we can and cannot control. We can improve our processes, attend to feedback, and become more aware of our biases. We cannot control luck, other people’s actions, or genuinely unforeseeable events. The architecture of choice is not only neural hardware and cognitive algorithms, but also accumulated experience, evolving identity, and systematic distortions in how we look back. We decide with reason and emotion, with memory and imagination, and within social contexts that continuously shape both our choices and the people we become.

Sources and Further Reading

Foundational Works

Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux. The definitive popular account of dual-process theory, heuristics and biases, and decades of judgment and decision-making research. Essential reading for understanding System 1 and System 2 thinking. https://en.wikipedia.org/wiki/Thinking%2C_Fast_and_Slow

Damasio, A. (1994). Descartes’ Error: Emotion, Reason, and the Human Brain. Putnam. Introduces the somatic marker hypothesis showing how emotions are essential for rational decision-making, based on studies of patients with prefrontal damage. https://en.wikipedia.org/wiki/Descartes%27_Error

Klein, G. (1998). Sources of Power: How People Make Decisions. MIT Press. Presents naturalistic decision-making research showing how experts use pattern recognition and mental simulation in real-world environments. https://mitpress.ublish.com/book/sources-power

Cognitive and Neural Mechanisms

Rangel, A., Camerer, C., & Montague, P.R. (2008). “A framework for studying the neurobiology of value-based decision making.” Nature Reviews Neuroscience, 9, 545-556. Comprehensive framework for understanding neural mechanisms of decision-making across brain regions. https://www.nature.com/articles/nrn2357

Bechara, A., Damasio, H., & Damasio, A.R. (2003). “Role of the amygdala in decision-making.” Annals of the New York Academy of Sciences, 985, 356-369. Reviews research on emotional processing and decision-making, particularly the Iowa Gambling Task. https://pubmed.ncbi.nlm.nih.gov/12724171/

Individual Differences and Expertise

Bruine de Bruin, W., Parker, A.M., & Fischhoff, B. (2007). “Individual differences in adult decision-making competence.” Journal of Personality and Social Psychology, 92, 938-956. Introduces the Adult Decision-Making Competence framework for measuring decision quality across domains. https://pubmed.ncbi.nlm.nih.gov/17484614/

Kahneman, D. & Klein, G. (2009). “Conditions for intuitive expertise: A failure to disagree.” American Psychologist, 64, 515-526. Landmark reconciliation between heuristics-and-biases and naturalistic decision-making perspectives, identifying when expert intuition is valid. https://www.scirp.org/reference/referencespapers?referenceid=1904881

Decision Training and Improvement

Morewedge, C.K., Yoon, H., Scopelliti, I., et al. (2015). “Debiasing decisions: Improved decision making with a single training intervention.” Policy Insights from the Behavioral and Brain Sciences, 2, 129-140. Demonstrates that brief, game-based interventions can produce durable improvements in decision quality that transfer to new contexts. https://journals.sagepub.com/doi/10.1177/2372732215600886

Values, Identity, and Moral Foundations

Schwartz, S.H. (2012). “An overview of the Schwartz theory of basic values.” Online Readings in Psychology and Culture, 2(1). Comprehensive overview of the most influential cross-cultural theory of human values and their role in decision-making. https://scholarworks.gvsu.edu/cgi/viewcontent.cgi?article=1116&context=orpc

Haidt, J. (2012). The Righteous Mind: Why Good People Are Divided by Politics and Religion. Pantheon. Accessible introduction to Moral Foundations Theory and how moral intuitions shape judgment and decision-making. https://en.wikipedia.org/wiki/The_Righteous_Mind

Evaluation and Bias

Fischhoff, B. (1975). “Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty.” Journal of Experimental Psychology: Human Perception and Performance, 1, 288-299. Classic paper introducing systematic study of hindsight bias and its implications for decision evaluation. https://doi.org/10.1037/0096-1523.1.3.288

Roese, N.J. & Vohs, K.D. (2012). “Hindsight bias.” Perspectives on Psychological Science, 7, 411-426. Contemporary review of hindsight bias research with meta-analytic findings. https://doi.org/10.1177/1745691612454303

Epstude, K. & Roese, N.J. (2008). “The functional theory of counterfactual thinking.” Personality and Social Psychology Review, 12, 168-192. Comprehensive framework for understanding how counterfactual thinking influences behavior regulation and improvement. https://doi.org/10.1177/1088868308316091