Driving Better Choices for a Healthier Life

The Impact of Behavioral Science on Healthcare Innovations

Balazs Fonagy
Chief Strategist

Bence Lukacs
Experience Design Lead

Anett A. Toth
Product Strategist

Behavior Design, Psychology, Healthcare
31 August 2023

Human decision-making is flawed; we frequently make choices that are against our best interests. In this article, we will discuss how to design choices that subconsciously nudge people to make the right decisions for their health.

How can we use nudges to remedy common judgment errors?

Heuristics and biases: mental shortcuts that simplify a messy reality but can lead us astray. 

Our Automatic System takes mental shortcuts to solve problems because the number of mundane daily decisions we have to make under some level of uncertainty is extremely high. Most of the time these shortcuts lead to acceptable results, but they also produce systematic errors of judgment and irrational decisions. We call these systematic errors cognitive biases. To implement nudges successfully, we need to know about possible “thinking traps” and turn them to our advantage.

Cognitive Bias: Systematic errors of judgment made by our intuitive thinking in order to simplify decision-making processes. While these shortcuts may have worked well in prehistoric times, they are not always effective in modern society.


Anchoring, or the dominance of the first piece of information, plays an important role in creating nudges. Humans need a starting point to begin our thought process, and we use the first information we receive or remember as a springboard for our subsequent ideas. That’s why expensive cars are placed at the front of the showroom: after you see their price tags, everything else seems cheap in comparison. Anchors aren’t necessarily numbers: if the first thing you hear about a doctor is that their last operation wasn’t successful, it will be hard to restore your confidence in their skills — even if they have the best track record in the city.


working with this bias: 

Lead with what matters: The first piece of information you give your customer will hold the most influence over their decisions. Lead with the one that is most likely to nudge them in the right direction. Build the narrative of your digital journey on a strong first impression and make sure that everything that follows reinforces it.

anchoring - the dominance of initial information in influencing user behavior

the stats don’t apply to me:  

People tend to think statistics won’t apply to them. When confronted with commonly shared statistics like “40-50% of US marriages end in divorce” or “a staggering 80% of restaurants fail within the first five years of operation”, we assume this won’t happen to us. It’s not because we are oblivious to statistics; when surveyed about average chances, people are usually quite accurate. Yet they estimate their own chances of success as exceptionally high. The rationale? Mostly just “I am me, and they are not.” This has a definite upside: without this unfounded optimism, there would be no new restaurants to sample. Evolution prefers those who take initiative, even when the odds are stacked against them.


working with this bias: 

Make risks feel personal: This type of nudge is extremely important in healthcare products, as customer overconfidence unfortunately also applies to risk-taking. People who engage in harmful habits (such as smoking, unprotected sex, or a diet mostly consisting of processed red meat) are usually aware of the risks associated with their activity. They just think the statistics don’t apply to them and they are magically immune to harm. The solution? Focus on subjective experiences (read on for the availability bias) instead of statistics. Boring numbers leave people unmoved.

availability, or jumping from subjective experiences to statistics: 

The availability bias describes how we assess the likelihood of an event based on how readily examples of the event come to mind. When we assess the risk of being attacked by a shark in Australia, we base our calculations on what we’ve heard. If you’ve read a few shocking articles about shark attacks, you perceive the risk as quite high. In reality, cycling in Australia is 50 times riskier than swimming in the sea. Your guesstimate will also depend on how much the given event captivates your imagination or frightens you (that’s why people often overestimate the frequency of plane crashes), and how recently you heard about such an event. 


working with this bias: 

Don’t be afraid to be dramatic: When you want people to take a health risk seriously, use tools that evoke emotion. Simple statistics about heart attacks might be forgotten in a second, while a terrifying video will burn into the viewer’s mind. Our Automatic System remembers it and raises the perceived risk accordingly.

Keep information at the forefront of the mind: Intentions to act more wisely often wane as the memories of the initial trigger fade. Users require well-timed reminders and refreshers to stay on track.

loss aversion: we prefer to hold onto what we have

The average person is more afraid of losing something than excited about gaining the very same thing. On average, we are only willing to risk losing $100 if we have the chance of winning $200, and this 2:1 ratio in favor of loss aversion appears remarkably constant across experiments. From a probability perspective, this behavior is irrational, underscoring how little the human brain cares about mathematical expectation.
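The 2:1 asymmetry described above can be sketched as a tiny value function. This is an illustrative simplification of the idea, not a full prospect-theory model; the `loss_aversion` coefficient of 2.0 is simply the rough ratio quoted in the text.

```python
def subjective_value(amount: float, loss_aversion: float = 2.0) -> float:
    """Perceived value of a gain or loss: losses weigh about twice as much."""
    if amount >= 0:
        return amount
    return loss_aversion * amount  # losses are amplified

# A 50/50 bet offering +$200 against -$100 feels only just acceptable:
expected_feeling = 0.5 * subjective_value(200) + 0.5 * subjective_value(-100)
print(expected_feeling)  # 0.0 -> the gamble is emotionally break-even
```

With any smaller upside, the perceived expected value turns negative and most people decline the bet, even when the mathematical expectation is still positive.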


working with this bias: 

A little scare can tip decisions: While modern marketing communication usually advocates concentrating on the value and the upside, we shouldn’t forget the power of loss aversion. To persuade people to follow the “smart path”, we have to highlight the possible losses that can occur if they do the opposite. But highlighting doesn’t mean showcasing risk statistics. It means explaining the possible loss in a way that captivates users’ imagination. It’s worth noting that it takes a careful design process to leverage this principle without causing unnecessary anxiety - especially in a field that can already cause high stress for patients.

status quo bias or creatures of comfort

It takes more effort to change something than to leave it as it is. Change involves more deliberation and unpredictability, and occasionally involves some actual legwork as well. Our energy-conserving brains tend to avoid it. This inertia is a powerful force, recognized by many designers. Freemium apps entice users with free trials, hoping they'll neglect to cancel their membership. Reality is even starker: users often don't forget to cancel, but would rather pay than exert effort to alter the existing situation.


working with this bias:

Include defaults that create the most benefit: There is a significant advantage to be gained from this powerful human tendency to do nothing: smartly set defaults. Whether you are working on a digital product or a policy, people will likely just go with the default option. For example, in countries where organ donation in case of accidental death is an opt-in program (such as the checkbox you encounter when applying for a New York ID), only 10-15% of the population registers for it. In contrast, in countries where it is an opt-out program, more than 90% give permission. That’s a massive difference, and you can expect defaults to exert a similar pull in your own products. Of course, keeping certain choices opt-in is legally mandated (newsletter subscriptions, for example), but in many cases the decision lies with the product’s owner, and the option to nudge people in the right direction is there for the taking.
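The opt-in versus opt-out difference comes down to which value is preselected. A minimal sketch, assuming a hypothetical health-app setting (the key name `medication_reminders` is made up for illustration):

```python
# The same preference surfaced two ways: opt-in vs. opt-out.
OPT_IN_DEFAULTS = {"medication_reminders": False}   # user must act to enable
OPT_OUT_DEFAULTS = {"medication_reminders": True}   # beneficial option preselected

def effective_setting(defaults: dict, user_choices: dict, key: str) -> bool:
    """Most users never touch settings, so the default usually wins."""
    return user_choices.get(key, defaults[key])

# A user who never opens the settings screen:
print(effective_setting(OPT_IN_DEFAULTS, {}, "medication_reminders"))   # False
print(effective_setting(OPT_OUT_DEFAULTS, {}, "medication_reminders"))  # True
```

The code changes nothing about the user's freedom to choose; it only changes what happens when they don't, which, as the organ-donation numbers show, is most of the time.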

framing, or presentation is all that matters

Framing essentially means that the exact same information can have a very different effect on your decisions depending on how it’s presented to you. This bias is so powerful that even experts fall for it sometimes. Test it for yourself: a “70% fat-free” yogurt sounds much healthier than the “30% fat” version, even though they are exactly the same. The statements “After the operation, 9 out of 10 patients are able to keep the affected tooth in the long run” and “1 out of 10 patients will lose the affected tooth within three months of the operation” convey identical odds. Yet the former sounds reassuring, while the latter instills alarm, often influencing decisions about undergoing the procedure. This is largely due to the brain’s preference for energy conservation; our Analytical System is seldom triggered to scrutinize these statements mathematically.
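The tooth-operation example above can be made concrete with a small helper that renders the same success rate in both frames; the wording is illustrative only:

```python
def frame_outcome(success_rate: float) -> tuple[str, str]:
    """Render identical odds as a positive frame and a negative frame."""
    kept = round(success_rate * 10)
    lost = 10 - kept
    positive = f"{kept} out of 10 patients keep the affected tooth"
    negative = f"{lost} out of 10 patients lose the affected tooth"
    return positive, negative

pos, neg = frame_outcome(0.9)
print(pos)  # 9 out of 10 patients keep the affected tooth
print(neg)  # 1 out of 10 patients lose the affected tooth
```

Both strings encode the same 90% probability; only the emotional coloring differs.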


working with this bias:

How you frame options profoundly influences decision-making: To make a choice more attractive, communicate it in a way that evokes positive images and emotions. If you want to steer people away, make it look riskier. You can do all of this by simply phrasing the truth differently. Of course, marketers have long known that the way we say things can significantly influence consumer choices (as embodied in McCann’s “Truth well told” slogan), but behavioral science reaffirms that framing significantly impacts even the most serious, data-driven decisions. As caring for our own health is full of choices that necessitate long-term, probability-based thinking, it’s vital to consider how the way you present an option affects emotional responses. Even when you’d expect the ensuing decision to be a very analytical, data-based one, you might find that no one actually bothers to look at the math.

mindlessness: the autopilot of everyday life

There are occasions when we instantly realize the need for thoughtful decision-making and our Analytical System quickly grabs the wheel to take control. No one tries to solve a quadratic equation based on intuition.

But in several cases we just do things on autopilot and deliberate consideration never actually happens. Most of our thousands of daily micro-decisions are made this way, bringing totally satisfactory results.

This autopilot state is called mindlessness.

Mindlessness only becomes a problem when we don’t recognize that a situation requires careful consideration and the decision we make with our Automatic System results in suboptimal choices.

It all starts with accepting the users’ lack of care. We can’t just expect that people will engage with our product in a mindful way. As harsh as it may sound, you can safely assume that people won’t care much about the details of your product and just want to get the benefits with as little thinking as possible. They will click through without reading the instructions and glance over important information.

working with mindlessness:

Train people when and where they care about it. Make sure that anything users have carelessly dismissed can be found and checked again, preferably in the context where it’s needed. It's preferable to explain a function at the point of use, not during onboarding. Users only pay attention to information if it appears immediately necessary for achieving their goal.

No matter what you do, there will always be errors caused by mindlessness. Expect errors coming from lack of attention and help people recover from them easily. It’s not because users are dumb: they just don’t care enough to put too much cognitive effort into operating your product.

Let the user interface (UI) do the heavy lifting. The right UI can effectively guide users who click around in a mindless state by utilizing signifiers and well-crafted information architecture, as well as leveraging existing mental models and biases like the ones outlined in this report. A well-designed UI created by skilled Interaction Design experts and supported by proper usability testing can nudge inattentive users in the right direction.

We are social animals: the choices of others influence ours

Our brain evolved to its level of sophistication mostly to facilitate social interactions and cooperation. We dedicate an immense amount of energy to considering what others might think and to running mental simulations of how they would respond to our possible actions. Social relations are so important to us that, once our basic biological needs are met, most of our higher-level goals and motivations revolve around them, such as pursuing status, which can only be understood in a social context.

If enough people say something, we tend to agree, even if it invalidates our own objective perception. Imagine how easily we succumb to conformity when discussing subjective matters.

We don’t just affect each other on a conscious level: we influence others (and are being influenced) constantly without our knowledge. These unconscious social effects can be leveraged in product design with great results.

We can even be convinced by others to not believe our own eyes. In a famous experiment, participants were asked to compare the length of sticks. In 20-40% of the cases, a group of random strangers (research assistants in disguise) were able to change a person's mind about their judgment by unanimously voting for another option, even though their own senses told them otherwise. Invisible peer pressure is a formidable force to reckon with.

How to work with social biases? 

Showcase the smart decisions of others. 
Leverage the tendency to conform to steer people toward smarter decisions with a little digital peer pressure.

  • Users tend to follow what has worked for others. A badge reading “80% of patients chose this” can be an effective nudge. An even stronger way to promote a direction is to use social networks to point out acquaintances who have already chosen to go with that particular option.
  • Define the perception of what is considered the norm. Real-life experiments show that simply informing people that compliance is the norm can significantly influence them. A message like “92% of our users never skip a day in measuring their blood pressure!” can provide the desired encouragement.
  • Most users want to perform well in comparison to others. Telling people that they are doing better than others usually motivates them to preserve their position, for example: “You belong in the fittest 8%!”. In the same vein, disclosing that they lag behind the average urges them to catch up, for example: “You have exercised less than 90% of our users. Let us help you catch up!”
a nudge toward adherence by our loved ones
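The social-comparison messages above can be sketched as a simple percentile-based selector. The thresholds and copy are made up for the example, not measured values:

```python
def comparison_message(percentile: int) -> str:
    """Pick a social-comparison nudge; percentile is the share of users
    this person outperforms (0-100)."""
    if percentile >= 92:
        return f"You belong in the fittest {100 - percentile}%!"
    if percentile <= 10:
        return ("You have exercised less than 90% of our users. "
                "Let us help you catch up!")
    return "You're keeping pace with most of our users. Keep it up!"

print(comparison_message(92))  # You belong in the fittest 8%!
print(comparison_message(5))   # the catch-up nudge
```

Both branches lean on the same instinct: nobody wants to fall behind the group, and those ahead of it want to stay there.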

Complex decisions are hard decisions: our information-processing abilities and our patience are both limited 

Our brain becomes most confused when faced with complex dilemmas. The complexity increases with the number of alternatives and factors that need to be taken into consideration. It's easy to choose between two painkillers if they have the same qualities except one of them lasts longer and is a bit more expensive. It’s a personal outcome that we can easily imagine: “I have less money in my pocket, but my pain is gone for longer.”

However, when you need to choose from 15 different painkillers while weighing a multitude of factors such as strength, effectiveness for different sources of pain, potential addictiveness and side effects, brand reputation, and recommendations from friends, the decision becomes overwhelming. To resolve this, you tend to go with your intuition.

In situations like this, the right nudge at the right time can steer customers toward more beneficial decisions.

There is such a thing as too many choices

In a market economy, we take it for granted that it’s good for customers to have as many choices as possible. However, experiments show that’s not really true: we are generally happiest when presented with a few well-structured choices. We usually don’t like it if one option is forced on us, but too many choices can lead to decision paralysis.

Have you ever wasted 15 minutes in a supermarket, baffled by the bewildering number of fabric softeners? That’s why some brands are so successful – they transform a hopelessly complex analytical decision into a simple, emotional one. 

People follow different decision-making strategies: Maximizers want to make sure they pick the best possible solution and are anxious about missing out on something that might be even better. No wonder they are especially paralyzed by a wide array of alternatives. At the other end of the spectrum are Satisficers, who have a somewhat easier time, as they are fine with settling for a solution that is “good enough”.


How to assist complex decision-making? 

In a complex world where people are already overwhelmed with information, simplicity needs to be a primary goal of digital product design. More is not always better: aim to cut down the number of possible choices to simplify decisions. This is a true balancing act, as people will have less opportunity to customize your solution — but product creators usually grossly overestimate the amount of customization people actually want. Simplifying decisions isn’t just good for customers, it’s also profitable: in a famous real-life experiment, shoppers faced with too many varieties of jam bought less of it.

Helping users with complex choices:

  • Simplify the decision. Otherwise, people will automatically do it themselves, often at the cost of losing important information. Beyond a certain level of complexity, only a well-trained and motivated analyst could make a fully rational decision. The everyday user will typically rely on intuition, picking just a few arbitrary factors to base their decision on. As a healthcare expert, you are in a far better position to architect a simpler choice, helping users decide based on the most important factors.
  • Provide recommendations. As a very strong nudge, you can recommend a solution and many will just take it (the more perplexed users are, the more they will lean on the function). What you base your recommendation on can obviously vary. One end of the spectrum is offering a few expert recommendations/crowd pleasers, which might not be the perfect fit for anyone but will be good enough for everyone. The other end would be data-driven recommendations, tailor-made for the given user — this solution can make sense if you have a massive product or service directory. But this only works if you possess enough data and know enough about your customers’ context and needs.

  • Connect choices to relatable outcomes and future personal experiences. People don't want an app with over 800 vitamin-rich recipes. They want easy meals to prepare, with plenty of variety, that will keep the family healthy. 
But which app is better suited to that goal: the one with 800 recipes or the one with 1200? For most of us, these numbers are meaningless. However, if you say your app contains a new dish idea for every evening for over two years, the information will be much easier to comprehend and a lot more relatable. In some cases, such as with different health insurance policies, it may be much harder to summarize the outcomes this way. However, it is worth the effort. A paralyzing dataset and policy pile will instantly be transformed into imaginable human experiences to choose from.

  • How you organize data and enable filtering is a nudge in itself. The way you enable people to filter possible choices will influence what kind of factors they’ll consider. Let’s say you have a website that allows users to browse doctors: if you prominently feature a filter for visit fees and years of experience, most people will accept that these are the main factors they should use to shortlist possible candidates. Yet, if you highlight customer ratings and availability, most users will arrive at a different set of finalists. Some customers will put effort into setting up their own criteria system, but you can safely expect that the majority will just go with the flow. Your job is to make sure this flow allows them to make the most beneficial choices.
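The recipe-count translation described above (800 recipes as "a new dish idea for every evening for over two years") is simple arithmetic; here is a sketch of such a converter, with the wording and thresholds chosen purely for illustration:

```python
def recipes_as_experience(recipe_count: int) -> str:
    """Translate an abstract recipe count into a relatable timespan."""
    years = recipe_count / 365  # one new dish per evening
    if years >= 2:
        return f"a new dish idea for every evening for over {int(years)} years"
    return f"a new dish idea for every evening for {round(years * 12)} months"

print(recipes_as_experience(800))   # ...for over 2 years
print(recipes_as_experience(1200))  # ...for over 3 years
```

The exact copy matters less than the principle: the number 800 means nothing on its own, but "two years of dinners" is an experience the user can picture.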

Do you want to learn more about how to influence behavior with digital products?

read our full whitepaper