Today, David Brooks, writing in the New York Times about the financial mess, says that our behavior may not follow the old rules about rationality.
My sense is that this financial crisis is going to amount to a coming-out party for behavioral economists and others who are bringing sophisticated psychology to the realm of public policy. At least these folks have plausible explanations for why so many people could have been so gigantically wrong about the risks they were taking. . .
If you start thinking about our faulty perceptions, the first thing you realize is that markets are not perfectly efficient, people are not always good guardians of their own self-interest and there might be limited circumstances when government could usefully slant the decision-making architecture (see “Nudge” by Thaler and Cass Sunstein for proposals). But the second thing you realize is that government officials are probably going to be even worse perceivers of reality than private business types. Their information feedback mechanism is more limited, and, being deeply politicized, they’re even more likely to filter inconvenient facts.
Brooks makes a couple of errors in today’s column. He writes as if individuals and collectives make decisions in the same way. Individuals’ actions generally come in the midst of a flow of normal activities, without deliberation or explanation. Explanations always come later. The processes of perception, decision, and action may not be the discrete steps that he posits at the beginning of the article. Some biologists claim that they are all tightly coupled within the cognitive processes.
Rather than picture a computer in the brain, we might instead imagine a huge database that filters sensory inputs and sends a signal to memory banks storing action schemes that correspond to those inputs. If both the filters and the action schemes reflect experience, it is not surprising that people respond most strongly to recent experience, even when doing so is deemed irrational by the norms of someone observing and analyzing the behavior in question. Any explanation vocalized to justify the action is guided by the immediate context and does not necessarily tell the story of what actually happened in the cognitive system.
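If one wanted to make this metaphor concrete, a few lines of code would do. The sketch below is purely illustrative (the class ExperienceFilter, its decay parameter, and the market cues are invented for this example, not drawn from any cognitive-science model): an agent that answers sensory cues by a weighted lookup over stored episodes, with recent episodes counting for more, will reliably produce behavior that a neutral scorekeeper would call irrational.

```python
from collections import defaultdict

class ExperienceFilter:
    """Toy model of the filter-and-scheme picture: sensory cues are
    matched against stored episodes, and the action scheme with the
    highest recency-weighted support fires. No deliberation occurs;
    explanation, if any, would have to come afterwards."""

    def __init__(self, decay=0.8):
        self.decay = decay      # < 1.0 means recent episodes dominate
        self.episodes = []      # (cue_set, action_scheme), oldest first

    def record(self, cues, scheme):
        """Store one episode of experience."""
        self.episodes.append((frozenset(cues), scheme))

    def respond(self, cues):
        """The 'decision' is just a weighted lookup over experience:
        each past episode votes for its scheme, discounted by age."""
        cues = frozenset(cues)
        support = defaultdict(float)
        weight = 1.0
        for past_cues, scheme in reversed(self.episodes):  # newest first
            overlap = len(cues & past_cues)
            if overlap:
                support[scheme] += overlap * weight
            weight *= self.decay
        return max(support, key=support.get) if support else None

agent = ExperienceFilter(decay=0.8)
for _ in range(3):                            # older lessons taught caution
    agent.record({"rising_market"}, "hedge")
for _ in range(2):                            # recent episodes rewarded buying
    agent.record({"rising_market"}, "buy")

print(agent.respond({"rising_market"}))
# -> "buy": two fresh episodes outweigh three older ones. An observer
# who counts all episodes equally (decay=1.0) would expect "hedge"
# and would call the agent's response irrational.
```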
In this system, Brooks’s portrayal of prejudice as something that distorts our rationality is not only wrong but very misleading. He is arguing that we have
. . . perceptual biases that distort our thinking: our tendency to see data that confirm our prejudices more vividly than data that contradict them; our tendency to overvalue recent events when anticipating future possibilities; our tendency to spin concurring facts into a single causal narrative; our tendency to applaud our own supposed skill in circumstances when we’ve actually benefited from dumb luck.
Prejudices are precisely the filters that guide our thinking. Every action is shaped by our historical accumulation of experience. When our acts don’t jibe with our explanations, or with what others take as the norm, the anomaly is written off as prejudice.
Institutional decisions are different. They may be presented formally as the act of a single decision-maker, but they are almost always the result of discussion and are guided by a mesh of explicit rules. The rational calculus is explicit, even if it is buried in a computer program. Explanations may precede the decision, as in appellate court judgments. In the financial crisis, it may not be the rationality of the actors that was faulty. Perhaps they were truly misled by the facts they were given; it was the unreality of the computers’ models, not of the actors, that was the problem.
And that is the point I make in my book. Our models of the world are always partial, simply because we cannot know all we need to know through the methodologies we employ to support the concept of rationality. Sustainability depends on exchanging the very models Brooks refers to for new ones that accept the complexity of the world and, accordingly, put prudence before certainty. Even the Greeks knew this and had a special word for the kind of understanding required in public affairs. They called it phronesis, which comes close to “prudence” in our language.