
No, your choices are not chaotic or totally irrational… Science proves otherwise.

Deconstructing a heresy: you would be considered totally irrational if you were willing to walk 15 minutes to save 25 euros on a jacket priced at 125 euros, but not to save the same 25 euros on a TV costing 1,500 euros, even though the savings are identical.

At this point, a basic definition is needed: being irrational means making a decision or acting without prior reasoning. Reasoning, here, means analyzing a situation using rules of thought based, for example, on logic, coherence, or optimization.

The example above is taken from an interview with a doctor of neuropsychology, who also states the following:

When we "change context, the individual does not make the same decision," and that is "totally irrational."

Yet it is precisely the opposite! As soon as reasoning is involved, whatever its logic, we can no longer speak of irrationality. The fact that a decision changes with context shows that these choices are not totally irrational. This is what the studies I will present demonstrate; I invite the reader to explore them further, as this is only a brief article.

We are also told that "our decisions are chaotic" and that "our way of thinking is not based on reason."

Formulated this way, these statements are absolutely false, and I will demonstrate how decisions are, of course, influenced by context and that there is a form of rationality behind our choices, far from chaos.

This subject interests me first and foremost, which is why I explore it whenever a statement catches my attention. Indeed, my work often involves helping you rationalize your choices as much as possible, making them well thought out. Nevertheless, it is important not to propagate false beliefs, so let us focus on the substance.

The claims made should first be analyzed in light of the current state of science on the subject. Our way of thinking is said to rest on two distinct cognitive modes, described by Daniel Kahneman in his book Thinking, Fast and Slow (2011): System 1 and System 2.

System 1 is fast, intuitive, and automatic. It operates in the background without conscious effort and relies on associations, past experiences, and heuristics to make immediate decisions. This mode of thinking allows us to react instinctively to a situation or recognize a face in a fraction of a second. It can also be summarized as a primitive emotional system.

Conversely, System 2 is slower, analytical, and requires significant cognitive effort. It intervenes when a situation demands reflection, such as solving a complex mathematical problem or analyzing a contract before signing it.

While System 1 is efficient for routine and quick decisions, it is also subject to cognitive biases and judgment errors. System 2, although more reliable, is energy-consuming, not free of biases, and lazy: we tend to activate it only when forced to. Thus, our decisions often result from a balance between these two systems, one favoring intuition and the other analytical rationality. However, separating their synergy seems questionable.

Furthermore, some researchers, such as Ulrich Schimmack, argue that this dichotomy is too simplistic and does not account for the complexity of human cognitive processes. They suggest that human thought is more fluid and that the two systems interact in a more integrated manner than the binary separation suggests. Moreover, later studies have questioned the reproducibility of some research supporting the two-systems theory, particularly in the field of priming, raising doubts about the robustness of certain conclusions. To illustrate (even though priming itself is also controversial), here is an example: if I show you the word "yellow" and then ask you for the first word that comes to mind, you are more likely to answer "banana" than "table," because your brain has been primed to associate yellow with something familiar. On this question, the debate remains open.

But let’s move on to the example used and its problematic shortcuts.

In the study conducted by Daniel Kahneman and Amos Tversky, two specialists in behavioral economics, the data and contexts are entirely different from the jacket and TV example. Participants were asked to imagine they were about to buy a jacket for $125 and a calculator for $15. The calculator vendor then informed the buyer that the same calculator was on sale for $10 at another store branch, located 20 minutes away. Sixty-eight percent of those surveyed said they would be willing to make the trip to save $5 on the calculator.

However, with another group of participants, the question was modified: now, the calculator cost $125 and the jacket $15. The calculator was available at another store branch for $120. In this case, only 29% of respondents said they would make the trip. In both cases, the amount saved was the same.
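The asymmetry in the two scenarios becomes clearer with the raw numbers: the absolute saving is $5 in both cases, but the relative saving differs sharply. A minimal sketch (the labels are mine, not from the study):

```python
# Same absolute saving ($5) in both versions of the experiment,
# but a very different saving relative to the item's price.
scenarios = {
    "cheap calculator ($15 -> $10)": (15, 10),
    "expensive calculator ($125 -> $120)": (125, 120),
}

relative_savings = {}
for label, (price, sale_price) in scenarios.items():
    saving = price - sale_price
    relative_savings[label] = saving / price * 100
    print(f"{label}: save ${saving}, i.e. {relative_savings[label]:.1f}% of the price")
```

The $5 discount is a third of the cheap calculator's price but only 4% of the expensive one's, which is one plausible reading of why willingness to travel drops from 68% to 29%.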

The first problem is the gap between imagining an action and actually performing it. Moreover, in neither case do 100% of participants react the same way: something else must come into play.

At the same time, one may ask whether the participants' intelligence was controlled for as a variable. In both cases, the total cost of the two desired items is identical, and the level of savings is the same; this is far from the jacket-vs-TV example. Treating this as simply a matter of perception, representation, or context seems a very limited approach.

The jacket/TV example, based on my extensive research, does not exist as a formal study. It is merely an imagined scenario in which it is deemed irrational to be willing to save 25 euros on a jacket but not on a television. Yet this rests on an excessive simplification of economic rationality, and not only that: it plays on the perception of cheap versus expensive objects and introduces other kinds of biases and contextual effects.

It is also very different from the Kahneman and Tversky study. Their study does not merely compare absolute savings but highlights different parameters, such as the perceived value of the saving relative to the object's cost, the context, and the effort required to obtain it. Their experiment shows that the decision to travel for a discount depends not only on the percentage saved but also (at the very least) on the mental categorization of the purchase: the same amount saved is perceived differently depending on whether it concerns a trivial purchase (a calculator) or a more significant one (a jacket). And this study concludes that rationality is limited, not that there is "total irrationality."

And that makes complete sense, since reasoning is involved—it is merely biased.

This experiment does not show that this purchasing behavior is irrational; rather, it illustrates the effects of perception.

It might be related to a "proportionality bias" or a bias of transactional relativity. It is also linked to the concept of mental accounting, developed by Richard Thaler.

This experiment demonstrates contextual rationality. But is it really a bias in this context?

Furthermore, the concept of bounded rationality, introduced by Herbert Simon, recognizes that individuals, faced with incomplete information and limited cognitive resources, adopt satisfactory decision strategies rather than optimal ones. This means that decisions are made considering the context and specific constraints.
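Simon's idea of satisficing can be sketched in a few lines: instead of scanning every option for the optimum, the decision-maker accepts the first option that clears an aspiration threshold. The function name, the prices, and the 100-euro threshold below are illustrative assumptions, not taken from Simon's work:

```python
def satisfice(options, is_good_enough):
    """Return the first option meeting the aspiration level,
    instead of searching all options for the optimum."""
    for option in options:
        if is_good_enough(option):
            return option
    return None  # no option clears the threshold

# Hypothetical example: accept the first jacket priced under 100 euros.
jacket_prices = [140, 125, 95, 60]
choice = satisfice(jacket_prices, lambda price: price < 100)
```

Here the search stops at 95 euros even though a 60-euro jacket exists further down the list: a satisfactory decision rather than an optimal one, which is exactly the distinction bounded rationality draws.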

In practice, economic rationality is not assessed solely through simple arithmetic.

It is also linked to other characteristics tied to context and the individual.

If you take the same experiment but instead place the customer in front of two internet pages, where the effort is merely to click on one tab or another, would the average customer (or at least the type of individual who participates in such studies) naturally think and choose the cheaper option?

Yes, I mention the average subject, because if we change the individual's context, the picture shifts: a Dubai oil tycoon has no time to waste saving 20 euros, or even 1,000 euros; he does not even want the question to cross his mind. ;)

When you earn 100,000 euros a day, this type of savings does not apply to you, and if it doesn’t apply to everyone, then it is not a cognitive bias, since a bias must be universal.

A trader will not waste five minutes to save 20 cents on a coffee. A student on a tight budget will.

This is the reality of contextual rationality.

Ignoring a financial aspect or savings based on context is entirely rational.

Thus, this behavior, far from being a bias, is a demonstration of contextual and adaptive rationality, where the value of savings is judged based on its real impact on the final decision, rather than on a simple mathematical rule disconnected from reality.


Sources

Li, Q., & Zhao, H. V. (2023). "Pacos: Modeling Users' Interpretable and Context-Dependent Choices in Preference Reversals." arXiv preprint.

Thaler, R. H. (1999). "Mental Accounting Matters." Journal of Behavioral Decision Making, 12(3), 183-206.

Simon, H. A. (1957). Models of Man: Social and Rational. Wiley.

Cairn.info (2011). "Rationalité limitée et prise de décision" [Bounded rationality and decision-making].

"Excellence is the result of consistent improvement."

Philippe Vivier

