Reality Filters 3: Sharpening and Leveling
Everyone is reasonable. That's why you shouldn't believe what you hear.
The previous post in this series focused on a fairly narrow problem: when we look at particular "big shots", whether they are famous songs, paintings, or people, we can't learn that much from them because of the sampling biases inherent in how they come to us. They present us with a distorted version of the world.
Sampling bias goes a lot deeper than that. It pervades almost all the information that we encounter in our everyday lives.
The Goat Story
My earliest lesson in critical thinking came from a local news story about a goat.
The evening news reported on a young autistic girl who was losing her emotional support goat. A neighbor had complained, and the city had issued a warning to the family that they had to get rid of the animal. The segment had an interview with the parents, who expressed how much the goat helped their daughter. It showed clips of the girl hugging the goat. It ended with the parents saying they weren't sure what they were going to do without the goat.
My parents and I were watching when this story played. I don't recall what set it off, whether my mom had made a comment of sympathy or if the news anchor chastised the city or neighbor for their part in taking the goat away, but my dad became animated about what was wrong with the story.
Let me make sure I don't accidentally paint my dad as insensitive. The man spent his working years as an educator, and a good chunk of his career was specifically in special education. He dedicated his professional life to making sure kids get the resources they need to learn and flourish. As a parent, he was always gentle and rarely raised his voice—making him the go-to parent when my sister and I had screwed something up and had to report it to a grown-up. The point is, he is a compassionate guy with experience and sensitivity for kids with special needs.
That said, something about this goat story irked him. "Well, why did the neighbor file a complaint? Was the goat eating his plants, pooping on his lawn, being loud? Why is the city acting on this? The city doesn't just take away pets because of a complaint, it has to be against a bylaw. Did the parents know it was against a bylaw when they got the goat, or did they not check?" (Note: this is from approximately three decades ago and I'm relying solely on my memory of the event; it's unlikely this is an exact quote from my dad, and it sounds nothing like him.)
My dad's point was simple: the story was trying to paint a picture of the goat family being mistreated. It was glossing over a lot of details that would muddy that picture. If you knew the family was partly to blame because they had ignored (or not checked) bylaws, it would diminish your sympathy for them. If you knew the city was just acting under its laws, it would diminish the outrage at the city. If you knew the neighbor reported the goat because the damn animal woke them up at 4AM every morning on their front lawn after eating all the flowers and evacuating its bowels, and the neighbor had spoken with the goat family no less than five times asking them to control the animal—suddenly the neighbor doesn't seem like the bad guy.
Even if more details didn't change the overall shape of the event, they would make the story less impactful and therefore maybe not worth running on the news at all.
Even as a kid I internalized an important lesson from this episode: when a story portrays people as acting poorly, try to understand their motivations. Maybe you're missing some important detail—or maybe the story didn't unfold exactly how it's being told.
Sharpening and Leveling
When we tell stories, whether anecdotes to friends or published news, we make choices about what details to keep, emphasize, or discard. We simply can't include every detail—the vast majority are irrelevant, boring, and you won't remember them. If I started a story about the funny way my toddler pronounced "cement mixer" by describing the placement of every object in the room when he said it, how many breaths I took during this event, the exact time and date down to the second, and all the events in the universe from the Big Bang up to that moment, you would stop listening long before I got to the pronunciation (it sounds like "mement misser", it's really cute).
To communicate is to make choices about what to communicate.
With the goat story, we can point to the choices made: the "point" of the story was to have the viewer feel something (sympathy or outrage) for the family, and especially the little girl. The details focused on were the family's feelings and the negative impact on them. The details left out were all those that would diminish the point: the story didn't include any of the neighbor's or the city's motivations, because those could make the goat family less sympathetic and blunt the emotional impact.
This is called sharpening and leveling.
We "sharpen" the details of the story central to the point. Every story has a point, some impetus that is compelling the speaker to open their mouths. Maybe it's to convey information, but more often it's to share some emotion. To tell a good story is to make sure the details that convey that emotion shine through. This might involve outright exaggeration or embellishment, but more often there is enough vagueness in an event that the way you choose to describe it gives enough "wiggle room" to sharpen. "She went to her room" can describe the same event as "she stormed off and slammed her door". Anything from "slightly annoyed" to "fuming" might be legitimate ways to describe the same outward reaction. The degrees of freedom we have from our choice of words allows us to shape the story to capture what we see as the emotional core. This isn't a bad thing—it allows us to signal to the listener what they should get out of the story. It adds clarity.
Similarly, we "level" the details that don't seem relevant to the story. They might be boring mundane details, but they also might be facts that mitigate the point we're trying to make with the story. If you're trying to express how frustratingly incompetent a coworker is, you probably won't mention the handful of tasks they've done perfectly well. Those would just muddy the underlying point you're trying to make: overall they can't do their job well! You instead focus on just the details showing the incompetence.
This process happens every time you hear an anecdote. That friend who told you the story of how unreasonable their boss was being. Or the one about how incompetent their coworkers were. How unfair the store clerk was. How ridiculous so-and-so is being about the scheduling of an event. Etcetera, etcetera.
The point isn't that everyone's a liar and you shouldn't believe anything you hear. If people didn't sharpen and level, it would be impossible to communicate. Imagine if scientific papers tried to describe in excruciating detail every aspect of every trial of every experiment. You would lose the forest for the trees and never learn what it all added up to. Same with anecdotes or news stories. The kernel we abstract away from a set of events and decide is worth telling is important. Telling all the details makes the story get bogged down until the point is lost.
But it's also important to recognize the selection pressures on the information people give you. They want you to understand how they feel, to take away a particular viewpoint or emotion from the story. They also want to keep it interesting. These pressures introduce a sort of sampling bias. Even when a story is an accurate description of events that actually happened, there's still bias in which facts the storyteller presents and how. We aren't getting an unvarnished view of the world.
The stories people tell are more than descriptions of reality. They are entertainment, bonding, and venting. But given that our first-person experience is limited, stories also give us a way to expand our experiences to learn more about the world. When should we trust the details of a story to tell us something true about the world? For that, we need a truth serum.
Bayesian Truth Serum
The goat family news story didn't explain the actions of the city and the neighbor. It left their side of the story out.
One lazy way people sometimes try to correct for these biases is to present "both sides" of an issue. Giving both sides equal footing has its own problems, as parodied by SMBC Theater:
Both sides aren't always equal. When there is a disagreement on facts, the answer isn't to present both sides on equal footing. It's to figure out what the truth of the matter is, and figuring that out should involve understanding why the other side has a different view.
Even when viewpoints are on equal footing, a good story might involve many sides—a story complaining about the bureaucracy at work might involve your manager, the Human Resources department, old policies developed by people no longer at the company, and decisions by company leadership. Not to mention you don't have access to all sides—you can't litigate your friend's anecdote about the mean barista by marching over to the coffee shop and demanding the other side of the story (I mean, you can, but it would be weird, no one would want to be friends with you if you behaved like this, and you would die alone).
Drazen Prelec from MIT has what he calls a "Bayesian Truth Serum" for dealing with situations where the truth is unclear. The cases Prelec is concerned with are those where you have a bunch of opinions on, for example, whether specific legislation is going to pass. The idea is to ask everyone two questions: what they think the outcome will be, but also what they think others think the outcome will be.
This is a simple test of meta-knowledge. Those who have a better model of what others think are likely to have a good model of the overall issue.
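(For the numerically inclined: Prelec's paper turns this into an explicit scoring rule. Answers that turn out to be "surprisingly common", more common in reality than the crowd predicted, earn an information score, and accurate guesses about everyone else's answers earn a prediction score. Here's a rough sketch in Python; the function name and the handling of zero probabilities are my own choices, and real applications have details this glosses over.)

```python
import numpy as np

def bts_scores(answers, predictions, alpha=1.0, eps=1e-9):
    """Sketch of Bayesian Truth Serum scoring for n respondents and K options.

    answers:     (n,) array of chosen option indices in {0, ..., K-1}.
    predictions: (n, K) array; row r is respondent r's predicted distribution
                 of answers across the whole group (rows sum to 1).
    alpha:       weight on the prediction score.
    """
    answers = np.asarray(answers)
    predictions = np.clip(np.asarray(predictions, dtype=float), eps, 1.0)
    n, k = predictions.shape

    # Observed fraction of respondents endorsing each option.
    x_bar = np.clip(np.bincount(answers, minlength=k) / n, eps, 1.0)

    # Geometric mean of everyone's predicted frequency for each option.
    log_y_bar = np.log(predictions).mean(axis=0)

    # Information score: your answer scores well if it is "surprisingly
    # common" (more common in reality than the crowd predicted).
    info = np.log(x_bar[answers]) - log_y_bar[answers]

    # Prediction score: your guess about the group scores well if it is
    # close to the actual frequencies (a log-scoring-rule term, so better
    # meta-knowledge means a higher score).
    prediction = (x_bar * (np.log(predictions) - np.log(x_bar))).sum(axis=1)

    return info + alpha * prediction
```

In the legislation example, each person would report "pass" or "won't pass" plus their guess at what fraction of the group says "pass"; the highest-scoring respondents are the ones whose meta-knowledge checks out.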
While Prelec is mainly thinking of forecasting elections or making medical diagnoses, this gives a good framework for determining reality in everyday life. We should put more trust in people and stories where there is some level of meta-knowledge.
If two people are in a disagreement, and one can explain where the other is coming from, that person probably has a better handle on the points of disagreement.
If you hear a story about frustrating, byzantine rules, and the person can describe the incentive structure that produced those rules, you should probably accept their descriptions (and proposed solutions) more readily.
If someone tells a story and everyone else is incredibly unsympathetic and acting irrationally, and the only reason they can give for it is "they're stupid", you might not want to put too much weight on the picture of the world they are giving you in their story. They're probably mad and venting, and a lot of important details are being leveled and other details being uncharitably sharpened.
You can't respond to every anecdote with questions to reveal the speaker's meta-knowledge (okay, you can, but again, it would be weird and no one would want to be friends with you and you would die alone). Even if you could, it wouldn't work for news stories and the like. But you can ask yourself how well you understand the dynamics of the situation. Is there an explanation for the people in the story acting the way they did that's consistent with them being reasonable? If not, you have to wonder whether you're getting an accurate picture of what really happened.
We can apply this to ourselves as well.
Our memories are also subject to sharpening and leveling. We don't retain and recall every detail of an event. We have biases in how we remember. But we can ask ourselves how well we understand an issue by testing our own meta-knowledge. That stupid process at work you think should be nixed—do you know how it came to be? Can you explain why the people who enforce it do so? If you can't do it without invoking people acting really unreasonably, ask yourself if you really understand it.
This is all premised on the idea that people are generally reasonable. I really believe that. I mean it in the literal sense—they have reasons for the things they do. People have incredibly stupid beliefs and do incredibly stupid things. But they believe and do those things for reasons. We are all navigating a complex world full of competing voices. We're subject to the biases of our fallible memory and unconscious motivations. We exist in complicated cultures, societies, and organizations with complex incentive structures. Some people come to the wrong beliefs not because they are inherently unreasonable, but because what they're exposed to is a filtered version of reality (see what I did there? Referring to the title of the series? Please clap). Try to understand the mechanisms for why they came to their beliefs. Have sympathy, be charitable, because even those you disagree with believe what they do for reasons.
We're all reasonable people just trying to make our way in this complicated world. Give each other credit for being reasonable. But also understand the selection pressures at play when people tell you stories so you don't believe just any crap you hear.
That's a great test to apply to see if you understand opposing viewpoints. I've always thought of that as standing in their concept space, adopting their assumptions, and then judging its merits. So many people can't do that; they can't drop their own assumptions when judging someone else's viewpoint.
If I can't see how their position makes sense or hangs together, I probably don't understand it. And if I don't understand opposing viewpoints, then my position isn't reasonable.
I love this. It reminds me of Chesterton's fence, the parable that if you see a fence blocking a road, and you don't know why it's even there, you should hold off on tearing it down until you can figure out why it was placed there. Maybe those reasons no longer apply—in which case, demolish away—but if you literally cannot imagine a legit reason why the fence was constructed, you don't know enough about the situation to act.