Inside every decision you make, two systems are arguing. One is quick, instinctive, and almost always wrong about probability. The other is deliberate, effortful, and easily exhausted. The interesting question is no longer which one you should trust — it's what happens now that machines have learned to do both.
System 1 — effortless, automatic, always running.
System 2 — effortful, slow, the self you believe you are.
We are blind to our own blindness. We have very little idea of how little we know. — On the limits of introspection
Anchoring. Whatever value enters your head first quietly drags every estimate after it toward itself — even when you know the number is irrelevant.
Availability. Vivid, recent, or emotional examples are mistaken for common ones. Plane crashes feel more frequent than car accidents.
Loss aversion. We will take strange risks to avoid a small loss, and refuse favourable bets that would, in aggregate, make us richer.
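The arithmetic behind "in aggregate" can be sketched with a short simulation. The bet below is a hypothetical example, not one from the text: a coin flip that loses $100 on tails but wins $150 on heads. Each individual flip risks a painful loss, yet the expected value per bet is 0.5 × 150 − 0.5 × 100 = +$25, so refusing every such bet forfeits that edge over time.

```python
import random

random.seed(0)

# Hypothetical favourable bet: win $150 on heads, lose $100 on tails.
WIN, LOSS, N = 150, 100, 10_000

# Total outcome of accepting the bet N times.
total = sum(WIN if random.random() < 0.5 else -LOSS for _ in range(N))

print(f"Average outcome per bet over {N} trials: ${total / N:.2f}")
# The average lands near the +$25 expectation, even though roughly
# half of the individual bets end in a $100 loss.
```

The simulation makes the asymmetry concrete: the single-bet experience (a coin flip that might cost $100) feels very different from the aggregate experience (a reliable ~$25 per play), and loss aversion is the tendency to judge by the former.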
Substitution. Asked something hard, the mind quietly swaps it for a question it can answer — and presents the answer as if it were the original.
The halo effect. An attractive speaker seems smarter; a confident CEO seems more competent. The first impression rents the rest of the room.
Hindsight bias. After the fact, the world feels inevitable. We rewrite our prior beliefs to match what we now know — and learn the wrong lessons.
A large language model is, in a strange way, the first artefact in human history that does both fast and slow thinking — pattern-matching at the speed of intuition, then reasoning at the pace of deliberation. It inherits our biases because it learned from us. It introduces new ones because it isn't us.
The original question of the book was: when can you trust your intuitions? The new question is harder: when can you trust someone else's, scaled to a billion users?