I tried hard to stay low-key for the first few entries. I really did.
But when it comes to weighing in on online conflict, or current events, I can barely say anything without referencing a couple of key anti-manipulation concepts I use to filter information. So, goodbye chill, goodbye any pretense of mitigating my intensity. Say hello, everyone, to one of the strongest tools I have for understanding the social aspect of manipulation.
It’s called the Basement Metaphor.
The story
Suppose you're visiting a friend's house.
The friend wants to be a gracious host and says they have refreshments in the basement fridge. You volunteer to go downstairs and pick something out yourself. When you walk downstairs and open the door, you cough. The basement hasn't been cleaned in a while. A fine layer of dust covers many of your friend's possessions, including the fridge where you retrieve your drink. You decide your friend might want to know about this.
“Hey,” you tell your friend, “your basement is getting dusty. You might want to clean it.”
They nod. “Oh, okay. Anything in particular I should look for? I'll clean it after you leave.”
Later that week, you visit a different friend. Again, the friend is storing drinks for guests in their basement, and again, you volunteer to get the drinks yourself. It's just as dusty down there as it was at your first friend's place.
“Hey,” you say again, “your basement is getting dusty. You might want to clean it.”
To your surprise, the friend blows up. “What?! My basement is clean! I'm a clean person! I don't leave a mess down there! In fact, I never even go down there! How dare you accuse me of being the sort of person to have a dirty basement! You're a terrible friend! I bet your basement is even dirtier!”
So.
At the end of the week, who's more likely to have a clean basement?
What it means
The Basement Metaphor describes how defensiveness holds us back.
Logic is like cleaning
Many people think reasoning is about coming up with an explanation that sounds plausible and then sticking with it. They see being wrong as a sign of weakness or stupidity, and are resistant to taking in new information that does not fit with their belief system.
But in a complicated world like the one we live in, sorting out your beliefs is more like cleaning. You don’t clean something once and then expect it to stay clean forever. When you get exposed to information that doesn’t fit in with your framework of the world, it creates clutter. When you refuse to revisit ideas that are important to you, they get dusty. When you can’t reconcile two contradictory parts of your belief system, it becomes a sticky mess. And there’s always more work to be done, because no one’s house or brain is perfectly spotless.
Another feature of reasoning that makes it like cleaning: it’s about the result rather than the process. A house isn’t clean because someone spent hours cleaning it. It’s clean because dirt and clutter are absent. A room could have everything in its proper place and still be dirty because it hasn’t been dusted.
Likewise, logic is about the absence of error in reasoning, as well as consistency with known information. Think of a mathematical proof. It’s only right because every step is right. You need to know how the entire process works, just as you need to know how to clean every part of your house to actually get it clean, but the goal is to get rid of all the gunk until objective truth remains.
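To make that concrete, consider the classic bogus proof that 2 = 1. Every line looks like routine algebra, but one dirty step poisons the whole thing:

Suppose a = b.
Multiply both sides by a: a² = ab
Subtract b² from both sides: a² − b² = ab − b²
Factor both sides: (a + b)(a − b) = b(a − b)
Divide both sides by (a − b): a + b = b
Substitute a = b: 2b = b
Divide both sides by b: 2 = 1

The dirt is in the division by (a − b): since a = b, that’s dividing by zero. One speck of grime in one step, and the whole proof collapses, no matter how spotless every other step is.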
Errors live in blind spots
Dirt and clutter accumulate over time. If nobody’s going through and cleaning up, they stay there. They don’t vanish if you ignore them.
In the same way everyone has chores they don’t like to do, everyone has stuff they don’t like to think about. We learn by asking questions. If people don’t make an effort to ask themselves why they believe what they believe (Is this really true? Have I seen anything that doesn’t fit this idea? Am I manipulating myself because there’s an emotional benefit to believing this? Does this make any sense?), those beliefs stagnate. These areas, the ones people don’t like revisiting, are what I call cognitive blind spots.
Blind spots can involve a certain process or a certain topic. Some people have never encountered important concepts in formal reasoning, the way some people never learn to clean an oven. Others are happy to apply the techniques they know to problems at work, but won’t apply the same rules to puzzle through interpersonal relationships, the way some people will wipe down kitchen counters but refuse to scrub toilets.
If we don’t ask ourselves why we believe what we believe, we can’t articulate where the belief came from; and if we can’t articulate where the belief came from, that’s an indication we arrived at it through some avenue other than formal reasoning. That makes the belief very likely to be wrong in some way. So if we have a belief we refuse to revisit, on purpose or by accident, it’s safer to assume it needs work, the same way that if we haven’t cleaned something lately, it’s safer to assume it’s dirty.
The tragedy of being human is that the questions we like to ask ourselves are the easier ones. It takes less energy to put a book back on the shelf than to deep clean a bathroom. It takes less energy to ask ourselves whether we’re buying the right sheet set than to figure out whether we’re inadvertently hurting our loved ones. The more important a belief is to someone’s worldview or self-image (I call those core beliefs), the less likely they are to question it.
And, sadly, yes. That means all core beliefs are probably flawed. That’s where the “basement” part of the metaphor comes in: it’s the stuff at the depths of your personality that not many others get to see.
If you know what to look for, it’s easy to find cognitive blind spots in yourself and in others. To self-monitor, look for the emotion of defensiveness: the feeling you get when you want to believe something is true, and don’t want to have it questioned. To infer details of others’ motivations, look for the absence of information.
Defensiveness indicates a blind spot in yourself
Defensiveness, as an emotion, represents insecurity about a particular belief. It’s an indicator that someone finds it unacceptable to live in a world where they could be wrong about this, or that it’s important to them that other people see them a certain way, and any challenge to that image would be emotionally difficult. Whatever the reason, they’re not willing to budge, they’re not willing to explain why, and that means they’re creating a cognitive blind spot.
This gives you a simple framework for finding cognitive blind spots in yourself. If you feel yourself getting defensive about a belief you hold, take a minute and ask yourself: Why am I feeling this? Why am I reacting this way?
We often see defensiveness as a negative emotion, or shameful, or even immoral. But when we start to feel defensive, we need to make a note of it. It’s our brain telling us where the clutter and dirt are.
Identifying defensiveness works best for self-monitoring, for a couple of reasons. One: you know what your own emotions are, but you can only infer someone else’s internal state, so it’s easier to work directly from emotions inside your own mind. Other people respond to defensiveness in different ways depending on personality: they could get combative, play the victim, shut down, and so on. That makes it harder to work directly from the emotion, because you have to take the extra step of guessing why they’re doing these things.
And two: it’s easier to start by applying these concepts to yourself and only change the way you relate to others once you’ve mastered that. The only mind you can directly change is your own, and if you can’t convince yourself in a relaxed setting, it will be nearly impossible to convey this information to someone else in a high-stakes social situation. It can be tough for some, but it’s necessary if this is the path you want to take.
Absence of information indicates a blind spot in others
Now we get out of our own minds and talk about how we communicate our brain wiggles to someone else. Let’s say, for example, that you want to convince your friend you have a clean basement.
The easiest way to do that is to invite your friend down into the basement and show them it’s clean.
If you can’t show them a clean basement, the next best option is to tell them you have a clean basement and hope they believe you.
People play the strongest cards they have. Dictating information about themselves and refusing to let you come to your own conclusions is an admission that the person doesn’t know how to show you a clean basement, and might not even understand how cleaning works at all.
So, to use a classic example, if you’re in an interaction and someone says “I’m a nice guy,” you can reasonably assume this is not a nice person. This is someone who doesn’t know how to behave like a nice person in a convincing way, but who benefits from you believing they are nice. Yes, it’s an assumption, and assumptions can be wrong. The point is, you start with that impression based on the information you were given and figure out what it would take to reverse it.
Look, it’s a lot but
We’re fancy monkeys on a space rock. We’re not expected to spend the whole day cleaning, or to know every tiny detail before we make decisions.
That is to say, behaving illogically is not partisan. It’s not something “those bad people over there” are doing. It’s a human thing.
I know the world has a lot to detangle. I know manipulative types like to take advantage of that confusion to promote their own values without people realizing. I know that changing your own mind, let alone the minds of others, is hard. I’ll get to it. Bear with me.
Usage
The Basement Metaphor applies broadly to situations where you have to separate usable information from bias and social noise. Because there’s a lot going on, we’ll split the suggested usage into three additional essays.
Part 2 of this series will focus on using the Basement Metaphor to become a faster and more effective learner. If you can identify and process feelings of defensiveness, you will gain the ability to analyze your own thinking for cognitive blind spots. We’ll also address iterative learning and how to tell when you’re falling into a Dunning-Kruger trap.
Part 3 of this series will describe how to use the Basement Metaphor for vetting purposes. Basically, whenever someone tells you something about themselves that needs to be shown rather than told, assume the opposite. And then don’t say anything. We’ll go over specific examples and clarify where this concept fits within a broader rhetorical strategy.
Lastly, Part 4 will address a common manipulation tactic defensive people use when introduced to this concept. The practical usage of the Basement Metaphor above directs us to make assumptions about someone else’s cognition. Some people might not be comfortable with that. However, if we understand that what we are doing is making an assumption, and we establish what it would take to reverse that assumption, we can become resilient against attempts by bad actors (or by ourselves) to confuse us. In short: “we know everything we think we know is wrong. Duh.”
So, this is one of the core concepts that helped me gain trust in my own judgment and become resilient against people encouraging me not to think for myself. I hope you’ll get something out of it too.