Who is Daniel Kahneman?

Daniel Kahneman is a psychologist and economist. He’s the Professor of Psychology and Public Affairs at Princeton University.

In 2002, he won the Nobel Memorial Prize in Economic Sciences. Together with his longtime colleague Amos Tversky, he laid the foundations of behavioral economics.

Kahneman’s work is often summarized as “people are irrational.” That’s not really accurate. He never said that people are wild and chaotic, only that we often contradict ourselves and make systematic errors in judgment.

Now let’s dive into the first great lesson from this book!

🧠 1. Your Mind’s Two Systems: 1 is fast and intuitive, 2 is slow and analytical

Throughout Thinking, Fast and Slow, Kahneman shares example questions we can use to test our own minds as we’re reading. The examples demonstrate the psychological concepts he later explains.

Here’s the first one:

A bat and ball cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?

In Kahneman’s surveys, more than half of students at elite universities like Harvard and MIT gave the wrong answer, and over 80% did at less selective universities. Most people immediately feel the ball costs 10 cents and the bat costs $1. In this case, our intuition is wrong; finding the correct answer requires a little more logical thinking. (The correct answer is that the ball costs 5 cents and the bat costs $1.05.)
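
If you want to see the logic spelled out, here’s a tiny sketch of the algebra (my own illustration, not something from the book):

```python
# A quick check of the bat-and-ball logic (plain arithmetic, not from the book).
# Let the ball cost x. Then the bat costs x + 1.00, and together they cost 1.10:
#   x + (x + 1.00) = 1.10  ->  2x = 0.10  ->  x = 0.05
ball = 0.05
bat = ball + 1.00

assert abs((bat + ball) - 1.10) < 1e-9   # total is $1.10
assert abs((bat - ball) - 1.00) < 1e-9   # bat costs exactly $1 more

print(f"Ball: ${ball:.2f}, Bat: ${bat:.2f}")   # Ball: $0.05, Bat: $1.05

# The intuitive answer (ball = $0.10) fails the second check:
# the bat would then cost $1.00, which is only $0.90 more than the ball.
```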

This example illustrates the two systems in our mind:

  • System 1 is fast, automatic and intuitive. It’s always working and recognizing connections between things. While System 1 is usually right, sometimes the quick judgments lead us to wrong conclusions, as in the example.
  • System 2 is slow, analytical and lazy. It’s usually turned off and only engages when heavier thinking is needed, like if someone asked you to multiply 19 by 33. System 2 is also the seat of our self-control, regulating our impulses. Scientists can tell this system is working when someone’s pupils dilate or their heart rate increases, both signs of cognitive effort.

Please note these two systems are more like nicknames or useful shorthand. They help us explain the tendencies of the human mind, but they are not physically separate areas of the brain.

Stanford Professor Robert Sapolsky describes an area of our brain just behind the forehead called the frontal cortex. This part of the brain evolved most recently, and it is larger in humans than in other animals. It is probably a big part of what makes human intellect so unique in the animal world.

The frontal cortex is responsible for long term planning, strategic decisions, regulating emotions, resisting impulses and more. Sapolsky writes, “The frontal cortex makes you do the harder thing when it’s the right thing to do.” To me, it sounds like the biological reflection of the System 2 talked about in this book. To learn more about the biological side of the brain, read our summary of Behave by Robert Sapolsky.

Our mind usually processes information through System 1, which is quick and intuitive but vulnerable to mistakes in some situations. When more cognitive effort is needed, our mind reluctantly turns on System 2, which is slower and more analytical.

😌 2. Cognitive Ease: Assuming something is true if it seems familiar

Our memory is a machine that makes constant connections. When we hear one idea, it activates other ideas nearby in our mind. One result is that ideas we have heard repeatedly before can feel intuitively true only because they spark recognition. Kahneman calls this Cognitive Ease and mostly attributes it to System 1.

Remote Association Testing

Scientists have created a Remote Association Test (RAT) that demonstrates the feeling of cognitive ease. For example, can you sense a connection between these three words:

Cottage, Swiss, Cake

Now how about these words:

Dream, Ball, Book

When most people hear the first three words, they can intuitively sense they are somehow related. This feeling of cognitive ease happens even before they can think of the word that connects all three, which is cheese. The second set of words doesn’t have any overlapping connection, so you won’t get that same intuitive feeling.

Repetition = Truth?

Unfortunately, cognitive ease makes us vulnerable to being influenced through simple repetition. If we hear something repeated over and over again, the mental connections are reinforced in our minds, and eventually the idea feels like it is true. The psychologist Robert Zajonc called this The Mere Exposure Effect.

Throughout history oppressive governments, news media and advertising companies have taken advantage of this quirk of human nature. As Kahneman puts it:

A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.

According to Wharton Business School professor Jonah Berger, one of the most powerful marketing techniques is creating triggers. This means creating a mental connection in customers’ minds between a product and another idea.

For example, around 2007 sales of Kit Kat chocolate bars were falling, so Hershey hired Colleen Chorak to rescue the brand. She knew most people drink coffee multiple times per day, so her brilliant idea was to make people think of Kit Kat every time they drank coffee.

Professor Berger described her ad campaign like this: “The radio spots featured the candy bar sitting on a counter next to a cup of coffee, or someone grabbing coffee and asking for a Kit Kat. Kit Kat and coffee. Coffee and Kit Kat. The two spots repeatedly paired the two together. The campaign was a hit. By the end of the year it had lifted sales by 8 percent. After twelve months, sales were up by a third. Kit Kat and coffee put Kit Kat back on the map. The then-$300 million brand has since grown to $500 million.” If you want to learn more about the psychology of effective marketing, then read our summary of Contagious by Jonah Berger.

Cognitive ease means our System 1 automatically recognizes when two ideas are closely connected. Unfortunately, simple repetition of an idea can make us feel it’s true just because it’s familiar.

🎯 3. Priming: Subtle stimuli can unconsciously change our actions

This brings us to the phenomenon of priming. Researchers have found that exposing people to one stimulus changes how they later respond to another stimulus. This happens automatically, below our conscious awareness.

For example, when people are shown the word EAT and they are asked to complete the word fragment SO_P, then they are likely to write SOUP.

On the other hand, when they’re first shown the word WASH, then they’re more likely to write SOAP.

Anchoring

Anchoring is when a person’s decision is highly influenced by an earlier piece of information they were given.

For example, one of Kahneman’s studies took place in San Francisco’s Exploratorium. They asked visitors to donate to a charity helping save seabirds from oil spills.

  • To some visitors, they simply asked for a contribution. In this case, people donated an average of $64.
  • To other visitors, they began by saying “Would you be willing to pay $5…” then asked for the contribution. These people donated an average of $20.
  • And to other visitors, they began with “Would you be willing to pay $400…” then asked. These people donated an average of $143.

As you can see, mentioning either a low or high dollar amount beforehand made a huge difference! That first number is called an anchor: it sets an expectation in people’s minds, which greatly affects their later decision about how much to donate.

Psychology professor Robert Cialdini says that salespeople often use anchoring. For example, car dealers try to finish negotiating the price of the car before talking about extras like tires and air conditioning. Why? Because the price of the car establishes a high anchor of $30,000, which later makes $500 feel small by comparison.

In his book Influence, Cialdini writes, “There is a principle in human perception, the contrast principle, that affects the way we see the difference between two things that are presented one after another. Simply put, if the second item is fairly different from the first, we will tend to see it as more different than it actually is.” To learn more techniques useful for marketing and sales, read our summary of Influence by Robert Cialdini.

Priming means one stimulus can influence how we respond to the next stimulus. Anchoring means simply hearing a high or low dollar amount can set our expectations and affect how much we pay for something.

💡 4. Availability Bias: Relying too much on how easily information comes to mind

The availability bias means we judge something as either more important or more true based on how easily examples of it come to mind.

Terrorism is effective because of availability bias. In Thinking, Fast and Slow, Kahneman recalls a period when Israel was hit by a wave of suicide bombings targeting buses. He knew intellectually that the probability of being killed by a bomb was lower than the probability of dying in a random car accident. Yet he was still affected by the vivid news stories and found himself accelerating faster than usual past buses.

Media coverage of killings and accidents makes us feel they are far more common than they really are. Car and plane accidents receive far more clicks and therefore advertising dollars than a routine disease death in a hospital. Yet the fact is, diabetes alone kills 4 times as many people as all accidents combined!

Confirmation Bias

How do we find out if a statement is true? Our System 1 searches for positive matches in memory. This is a form of confirmation bias.

  • For example, if people are asked “Are you a friendly person?” then they automatically try to remember situations when they were friendly.
  • On the other hand, if they are asked “Are you an unfriendly person?” then they will search their memory for times they weren’t nice.

So our System 1 works by searching for examples that confirm a statement. Unfortunately, this process makes us temporarily blind to any counter-examples. Kahneman calls this human tendency to have blind spots “What You See Is All There Is,” and he says it leads to overconfident beliefs built on weak evidence.

In the book Pre-Suasion, professor Robert Cialdini says one of the most powerful persuasion techniques is directing a person’s attention. He writes, “We are said to ‘pay’ attention […] when attention is paid to something, the price is attention lost to something else. Indeed, because the human mind appears able to hold only one thing in conscious awareness at a time, the toll is a momentary loss of focused attention to everything else.” Learn more about the science of attention and persuasion by reading our summary of Pre-Suasion by Robert Cialdini.

Availability bias means we check if a statement is true by how easily examples come to mind. This makes us overestimate risks from rare events that receive lots of media coverage, like shark attacks and terrorism. We also become blind to counterexamples, a form of confirmation bias.

🔀 5. Substitution: Answering an easier question than the one asked

System 2 is lazy and it doesn’t engage more than necessary. Kahneman says one consequence of this is our minds use many intuitive heuristics, which means mental shortcuts that save time and effort.

This is the essence of intuitive heuristics: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.

For example, in one study German students were asked two questions:

  1. How happy are you in life?
  2. How many dates did you go on last month?

When the questions were asked in this order, the researchers found almost no connection between the quantity of dates and happiness. But everything changed when they flipped the order of the questions. They asked:

  1. How many dates did you go on last month?
  2. How happy are you in life?

This time, they found a very strong connection between dates and happiness. What’s going on?! Kahneman says the question “How happy are you?” is simply too complicated for a quick answer. Think about it: you would need to list all areas of your life, rate them and then add it up.

Instead, the students use a shortcut, substituting an easier question for the one they were asked. The easier question appears to be “How do I feel right now?” And this is why changing the order of the questions has a strong impact. If someone has just been reminded they went on no dates last month, they feel worse, so they report being less happy with life.

System 2 is lazy, so we use mental shortcuts called heuristics. One of those is answering an easier question than the one asked. For example, students answer “How happy are you in life?” by checking “How do I feel right now?”

📊 6. Base Rate Neglect: System 1 uses stereotypes, not statistical thinking

Here’s another fascinating exercise from one of Kahneman’s studies that also demonstrates the substitution of an easier question.

The first part is to imagine there’s a random graduate student named Tom W and you must guess which field he is studying. Rate the probability from 1 to 9 that he is in:

business, computer science, engineering, humanities, law, medicine, library science, life sciences, social science

Most people answer this question by ranking the fields from most to least popular, because we have no other information about Tom to use. In statistics, these are called the base rates—the estimates of the sizes of each category.

The second part of this exercise is to read this description of Tom W, then rank his probability of being in each field again from 1 to 9. However, keep in mind this is based on unreliable psychological tests.

Tom W is of high intelligence, although lacking in true creativity. He has a need for order and clarity, and for neat and tidy systems in which every detail finds its appropriate place. His writing is rather dull and mechanical, occasionally enlivened by somewhat corny puns and flashes of imagination of the sci-fi type. He has a strong drive for competence. He seems to have little feel and little sympathy for other people, and does not enjoy interacting with others. Self-centered, he nonetheless has a deep moral sense.

When most people complete this exercise, they guess Tom is probably studying computer science or engineering. That’s wrong!

This error highlights how our System 1 can mislead us. The vivid description creates an overwhelming intuition from our System 1 that Tom must be in certain fields. He fits our mental prototype of a computer science student. Unfortunately, this feeling blinds us to other information that should be part of our answer—namely the base rates.

Statistical Thinking

You can guess Tom W’s university field with your System 2 and statistical thinking by:

  1. Starting with your base rate estimates from the first part of the exercise.
  2. Then adjusting FROM the base rate estimates based on the description. You could raise the probability he is in a field like computer science, but since the description was labelled “unreliable,” you should only adjust a little.

The conclusion? An informed guess of Tom W would still place him as more likely to be in the more popular fields like humanities or social science, rather than less popular fields like computer or library science.

Yes, this is counterintuitive. If you feel a little confused, don’t worry! Kahneman shows this problem to many top students and colleagues and they are also blinded by the wrong intuitive answer. That’s the point of this exercise!

(If you want to take your knowledge deeper and learn how to answer problems like these using math formulas, then look up tutorials for Bayes Theorem online.)
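
To make that kind of statistical thinking concrete, here’s a minimal sketch of a Bayesian update. All of the base rates and “how well the description fits” numbers below are invented for illustration; only the method of adjusting from base rates is the point:

```python
# Hypothetical Bayes' Theorem sketch for the Tom W problem.
# All numbers are made up; only the method matters.

base_rates = {                  # P(field): rough guesses about how common each field is
    "humanities":       0.30,
    "social science":   0.25,
    "business":         0.20,
    "other":            0.16,
    "computer science": 0.07,
    "library science":  0.02,
}

# P(description | field): how well Tom W "sounds like" each field.
# Because the description is labelled unreliable, these values are kept close
# together, with only a mild tilt toward computer science.
fit = {
    "humanities":       0.08,
    "social science":   0.08,
    "business":         0.10,
    "other":            0.10,
    "computer science": 0.20,
    "library science":  0.15,
}

# Bayes' Theorem: P(field | description) is proportional to P(description | field) * P(field)
unnormalized = {f: fit[f] * base_rates[f] for f in base_rates}
total = sum(unnormalized.values())
posterior = {f: round(v / total, 2) for f, v in unnormalized.items()}

print(posterior)
# Computer science roughly doubles from its base rate (0.07 -> ~0.14),
# but humanities and social science remain the more likely answers overall.
```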

When calculating probability, our System 1 intuition compares things to representative stereotypes in our mind. On the other hand, statistical thinking begins with base rate estimates of the underlying categories, then adjusts up or down based on additional information.

📖 7. Narrative Fallacy: Making oversimplified stories of the world

The economic thinker Nassim Taleb coined the term “Narrative Fallacy” to describe our human tendency to build narratives or stories for why things happened.

We have an urge to find the cause of everything. Unfortunately, it’s easy to make up an explanation after something happened, even when there was no real cause and effect relationship. This creates an illusion that our world is more predictable and certain than it really is.

Kahneman shares a story from Nassim Taleb to illustrate the narrative fallacy. Many years ago, on the day Saddam Hussein was captured in Iraq, bond prices happened to rise. So Bloomberg News published the headline “US Treasuries Rise; Hussein Capture May Not Curb Terrorism.”

Then half an hour later, bond prices fell. The headline was revised to “US Treasuries Fall; Hussein Capture Boosts Allure Of Risky Assets.” This raises the question: how can one cause (Hussein’s capture) explain two contradictory effects? Because it was never really an explanation, only the human desire to find a causal connection between the major events of the day.

In Nassim Taleb’s book The Black Swan, he says the true test of knowledge is being able to predict an event beforehand, not explaining it after the fact. Taleb wrote, “The way to avoid the ills of the narrative fallacy is to favor experimentation over storytelling, experience over history, and clinical knowledge over theories.” If you want to hear more fascinating ideas about economics and uncertainty, then read our full summary of The Black Swan by Nassim Taleb.

Regression to the Mean

Many events that are explained through narratives can be more easily explained through a statistical idea called regression to the mean. In other words, a return to the average.

For example, many sports fans believe in the “Sports Illustrated Cover Curse.” This is a legend that says athletes who appear on the cover of the magazine later fail to keep performing. Anybody can find many examples that appear to “prove” this curse is true.

However, the phenomenon can be easily explained through regression to the mean. Think of it like this: athletes are chosen for the cover because they have recently shown spectacular performance. That performance appears to be skill, but it’s often a streak of random luck. So over time, they naturally return to their average score. Our failure to see that many events are essentially random makes us create causal explanations when the real reason is simple probability.
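
Here’s a small simulation of that idea (my own illustration, with made-up numbers): model each season as fixed skill plus random luck, pick the athletes with the best first seasons, and watch their second seasons drift back toward average:

```python
import random

random.seed(42)

# Hypothetical model: one season's performance = stable skill + random luck.
athletes = [{"skill": random.gauss(100, 10)} for _ in range(1000)]
for a in athletes:
    a["season1"] = a["skill"] + random.gauss(0, 15)   # luck in season 1
    a["season2"] = a["skill"] + random.gauss(0, 15)   # fresh, independent luck in season 2

# The "magazine cover" athletes: the top 10 performers of season 1.
covers = sorted(athletes, key=lambda a: a["season1"], reverse=True)[:10]

avg_s1 = sum(a["season1"] for a in covers) / len(covers)
avg_s2 = sum(a["season2"] for a in covers) / len(covers)
print(f"Cover athletes, season 1 average: {avg_s1:.1f}")   # inflated by lucky seasons
print(f"Cover athletes, season 2 average: {avg_s2:.1f}")   # falls back toward ~100
# No curse required: season 2 simply loses the lucky boost that earned the cover.
```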

The Narrative Fallacy is finding causes to explain events, even when the true cause was probably randomness. These stories create the illusion the world is more predictable than it really is.

❤️ 8. The Affect Heuristic: How we feel shapes what we think

We humans like to believe we are rational beings, but the truth is our reasoning is often controlled by how we feel.

The Affect Heuristic says we use our emotional state as a shortcut in decision making. How we feel about something changes which facts we will accept or reject about it. This is how our mind keeps ideas consistent and coherent, so that our everyday choices are simpler.

In a study, psychologist Paul Slovic discovered that when people are asked about new technologies like water fluoridation, their views of the risks and benefits are unusually consistent. In other words, if they believed something was safe, they also believed it had great benefits. But if they believed something was risky, it was hard for them to recognize any benefits.

Remarkably, changing people’s feelings about one side of the argument also changed their feelings about the other side. Convincing people of the benefits of a technology also made them less concerned about the risks.

The Halo Effect

The Halo Effect is a cognitive error when one positive thing about someone highly influences our overall judgment of them. For example, many studies have found people rated as more physically attractive are automatically seen as more intelligent.

It is another way our mind tries to create a coherent or consistent picture of the world. If someone is attractive, then we feel good about them, and that makes it harder to fit in negative facts about them.

On the other hand, a statement like “Hitler liked to play with cute puppies” is hard to fit into our picture of him as the ultimate evil.

The Affect Heuristic means our emotional state shapes our thinking. If we feel good about something or someone, then it’s harder for us to believe negative facts about them. That’s why physically attractive people are automatically rated as more intelligent.

🎓 9. Illusions of skill: When are expert intuitions reliable?

Psychology researcher Gary Klein wrote the book Sources of Power, which examined why some experts appear to have almost magical intuition.

For example:

  • A firefighting captain had a bad feeling inside a burning house and began yelling for his team to get out. Moments after they left, the floor they had been standing on collapsed.
  • Chess masters seem to know the perfect move immediately without much thinking, out of thousands of possible moves.
  • Experienced doctors can often glance at a patient and instantly guess the correct diagnosis.

Illusions of skill

On the other hand, Daniel Kahneman says many fields contain mostly an illusion of skill, including professional investing, clinical psychology, counselling, political science, etc.

For example, his analysis showed virtually no correlation between how a fund manager performed in one year and how they performed in other years. Despite this, fund managers receive handsome bonuses if their fund appears to perform well in a given year. The entire industry seems to actively ignore that it is built on an illusion of skill, perhaps because everyone’s paycheck depends on believing the illusion.

Another study by Philip Tetlock tracked predictions from 284 political experts over 20 years. The results were terrible, with the predictions being less accurate than random chance. All their accumulated knowledge and degrees did not make them better forecasters than the average person on the street.

Fast feedback is necessary for building skill

Why did Gary Klein see magical expert intuition everywhere, while Daniel Kahneman could only find illusions of skill? They decided to collaborate on a paper together to find out.

Long story short, they discovered the key to developing real expert intuition was fast feedback:

  • Gary Klein studied experts who received feedback quickly, which allows learning to happen. A chess master knows within seconds whether they made a smart move or not.
  • Daniel Kahneman studied experts who received very slow feedback, or none at all. An investor may not know if they picked the right stock until years later. This lack of feedback prevents learning and the development of accurate intuitions.

By the way, most top personal finance books today recognize that professional fund managers (aka stock pickers) don’t help our savings grow faster. So they recommend we put our money into index funds instead. An index fund is a collection of stocks where nobody picks individual winners; instead, you’re basically buying a small piece of every stock on the market.

Personal finance expert Andrew Hallam says a 15-year long study found that “96 percent of actively managed mutual funds underperformed the US market index after fees, taxes, and survivorship bias.” Learn the fundamentals of smart investing in our summary of The Millionaire Teacher by Andrew Hallam.

The main difference between true expert intuition and the illusion of skill is fast feedback, which enables learning. Doctors, firefighters and chess masters can see the results of their choices quickly. Investors, political experts and psychologists often need to wait years.

🤖 10. Algorithms: Simple formulas usually beat expert judgments

In 1974, psychologists Howard and Dawes did a study which found that a simple formula could predict marital happiness better than relationship counsellors. Here it is:

frequency of lovemaking minus frequency of quarrels

Fill out that formula. A positive number means a happy marriage, but a consistently negative number reliably predicts upcoming separation or divorce.
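
As a toy illustration, that formula is simple enough to write in a few lines of code (the example inputs below are placeholders, not data from the study):

```python
def marital_outlook(lovemaking_per_month: int, quarrels_per_month: int) -> str:
    """Howard and Dawes' simple score: frequency of lovemaking minus frequency of quarrels."""
    score = lovemaking_per_month - quarrels_per_month
    if score > 0:
        return "happy"
    if score < 0:
        return "at risk of separation"
    return "borderline"

print(marital_outlook(12, 4))   # happy
print(marital_outlook(2, 9))    # at risk of separation
```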

Psychologist Paul Meehl first demonstrated how simple algorithms could outperform expert predictions. For example, he found that in predicting the future grades of college students, counsellors were far less accurate than a simple algorithm. Over 200 similar studies have since been conducted, and the algorithms beat the experts about 60% of the time. The rest of the results were a draw, but the algorithms were still significantly cheaper than paying experts.

In 1955, Kahneman was given the job of designing a new interview process for the Israeli Defense Forces. Under the old process, interviewers got to know recruits over about 20 minutes, then made a personal judgement about whether they would do well in the army. This had been found to be ineffective.

Kahneman’s new process instructed interviewers to ask specific factual questions (about work experience, sports participation, school punctuality…) then score each answer from 1 to 5. The scores were fed into an algorithm, which produced the final judgment. The interviewers were unhappy about this robotic new style of working, but the tests became far more effective predictors.
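
Here’s a minimal sketch of that kind of structured scoring. The trait names and the equal weighting are my own assumptions for illustration; the book doesn’t list the exact questions or weights Kahneman used:

```python
# Hypothetical sketch of a structured interview score in the spirit of Kahneman's method.
# Each trait is rated 1-5 based on factual questions; a fixed rule combines the ratings.

TRAITS = ["punctuality", "sociability", "responsibility",
          "work experience", "physical fitness", "independence"]

def recruit_score(ratings: dict) -> float:
    """Average the 1-5 ratings; the formula, not the interviewer, makes the final call."""
    for trait in TRAITS:
        if not 1 <= ratings[trait] <= 5:
            raise ValueError(f"{trait} must be rated from 1 to 5")
    return sum(ratings[t] for t in TRAITS) / len(TRAITS)

ratings = {"punctuality": 4, "sociability": 3, "responsibility": 5,
           "work experience": 2, "physical fitness": 4, "independence": 3}
print(recruit_score(ratings))   # 3.5
```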

Most of the time, simple algorithms can provide more accurate predictions than experts. Build your own algorithms by gathering facts, then weighing the facts according to a standard formula.

💎 11. Prospect Theory: The inconsistency of human wants

For a long time, economists used expected utility theory. This theory describes how perfectly rational humans would make decisions. Then Kahneman and his colleague Amos Tversky changed everything by creating Prospect Theory. This is the work Kahneman won his Nobel Memorial Prize for.

Prospect theory was based on experiments with real people, so it recognized that humans don’t always behave rationally. Kahneman and Tversky showed that our choices are driven not just by rational utility, but also by the psychological value we place on different outcomes.

Here are the 4 most important insights from Prospect Theory:

a) Loss Aversion

Imagine you were offered a 50/50 gamble. There is a 50% chance you will lose $100 and a 50% chance you will win $100. Would you take the bet?

Most people wouldn’t. Contrary to expected utility theory, real people don’t value gains and losses equally. We hate losing money more than we like winning the same money.

Experiments demonstrated the average person feels 2-3 times more strongly about losing than winning. This means that for someone to accept a 50% risk of losing $100, they would need to be offered a 50% chance of winning at least $200 to $300.
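
As a rough sketch of that rule of thumb (an illustration, not Kahneman’s exact model), you could write the gamble decision like this, with the loss-aversion factor taken from the 2-3x range above:

```python
# Rough sketch of loss aversion (illustration only, not the formal prospect theory model).
LOSS_AVERSION = 2.5   # losses "feel" roughly 2-3x bigger than equal gains

def feels_worth_taking(gain: float, loss: float, p_win: float = 0.5) -> bool:
    """Accept the gamble only if the felt value of the upside outweighs the felt loss."""
    felt_value = p_win * gain - (1 - p_win) * LOSS_AVERSION * loss
    return felt_value > 0

print(feels_worth_taking(gain=100, loss=100))   # False: a 50/50 bet for $100 is rejected
print(feels_worth_taking(gain=300, loss=100))   # True: a $300 upside finally tips the balance
```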

b) The Certainty Effect

Imagine you were offered a choice: win $90 for sure, or take a 95% chance of winning $100. Which would you choose?

According to expected utility theory, the value of each choice should be calculated as probability times value. So in this example, 95% chance to win $100 equals $95. The gamble theoretically beats the $90 sure option.

But in reality, most real people would choose the sure thing. This reflects the psychological value humans place on avoiding uncertainty. In situations where we gain money, we are risk averse.

c) Risk Seeking, if all options are bad

However, now let’s flip the previous example around.

Imagine you were offered a choice between losing $90 for sure, or a 95% chance of losing $100 (which means a 5% hope of losing nothing).

In this case, most people suddenly don’t want the sure thing. They would rather risk losing even more money because of that slim hope of losing nothing. So when all options are bad, people suddenly become risk seeking.

d) The Possibility Effect

Finally, people also tend to over-value small possibilities.

This is why many people buy lottery tickets. The amount paid for the ticket is more than the rational calculation of probability times value; that’s how lotteries make money. However, we humans put a psychological value on the mere possibility of winning that large amount.
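
To put rough numbers on that calculation (the odds and prize below are hypothetical, just to show “probability times value”):

```python
# Hypothetical lottery, to illustrate probability times value.
p_win  = 1 / 10_000_000     # made-up odds: one in ten million
prize  = 1_000_000          # made-up jackpot: $1,000,000
ticket = 2.00               # price of the ticket

expected_value = p_win * prize
print(f"Expected value of the ticket: ${expected_value:.2f}")           # $0.10
print(f"Rational loss per ticket:     ${ticket - expected_value:.2f}")  # $1.90
# We pay $2.00 for 10 cents of expected value; the gap is the psychological
# weight we give to the mere possibility of the big win.
```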

This is also why people buy insurance. Obviously insurance companies make a profit by taking in more money in insurance premiums than they pay out. However, there is a psychological value for us in eliminating the possibility of a catastrophic one-time loss.

Prospect Theory demonstrated how psychological value affects human choices, rather than just rational utility. We care more about avoiding losses than about equal gains, we prefer certainty, we over-weight small possibilities, and if all options are bad we become risk-seeking.

🖼 12. Framing Effects: How changes in wording affect our choices

Framing means how an option is worded can drastically change whether people choose it or not.

For example, Amos Tversky conducted a study with physicians at Harvard Medical School. They were given a description of a patient and a surgery and asked whether they would recommend the surgery.

Half the physicians were told the surgery had a “90% survival rate”, the other half were told it had a “10% mortality rate.”

The two statements actually mean the same thing, yet that small change in wording caused a big difference. When the surgery risk was framed in terms of “survival,” 84% of physicians recommended it, but when it was framed as “mortality,” only 50% did. The doctors would have known the two statements were logically identical if shown side-by-side, which makes the results even more striking.

The framing effect says how choices are worded has a big effect on our preferences. Doctors were more likely to recommend a surgery with “90% survival rate” over one with “10% mortality rate,” although both statements are identical.

🔦 13. The Focusing Illusion: We over-value whatever we’re thinking about

Kahneman finishes the book with a key insight:

Nothing in life is as important as you think it is when you are thinking about it.

This is called The Focusing Illusion: We overestimate the effect of whatever we are focusing on right now.

For example, did you know that paraplegics are not less happy than other people? That is difficult for many to believe, because right now we are focusing on the one big difference between paraplegics and ourselves: not being able to use their legs.

However, in their daily lives, paraplegics themselves are NOT focused on that. They are focused on the other parts of life: a conversation, a movie, a frustration, etc. After about a year, paraplegics get used to their condition, which means they barely think about it, so it doesn’t affect their happiness much.

In the same way, most of us have had the experience of buying a new car or gadget, feeling excited at the beginning, then having it fade into the background of our life. This is a phenomenon the psychologist Daniel Gilbert has cleverly named miswanting.

The Focusing Illusion says “Nothing in life is as important as you think it is, when you are thinking about it.” This means both positive and negative events have a smaller long-term impact on our happiness than we expect, because we stop focusing on things.

Conclusion

This book was over 400 pages! Many online reviewers have called it difficult or tedious to read, but I’m glad that I did. Kahneman is a brilliant thinker, inventing many of the ideas that other writers have popularized.

If you enjoyed this summary, then the book itself can help you learn these ideas more deeply through countless examples, case studies and stories. I recommend it, if you have the patience! I also think you’ll love Robert Cialdini’s books, which include Influence and Pre-Suasion.
