If a thousand people “Like” this post, Mark Zuckerberg will spare this kitten. Please share!
Chances are you have a friend or relative who is constantly posting dubious “facts” on social media: that you can charge your electronic devices by plugging them into an onion, that entering your PIN in reverse at an ATM summons the police, or that this year Halloween falls on a Friday the 13th for the first time in 666 years. (Think about that one for a moment.)
We can laugh all day at our paranoid friends and computer-illiterate aunts for falling for Facebook hoaxes, but the basic offense here — passing along information without any attempt to verify it — is something most of us probably do all the time.
Bad information isn’t always obvious, and it probably wouldn’t occur to you to investigate a claim unless it sounded untrue to you from the beginning. There’s pretty good evidence that we’re much more gullible than we think: we tend to believe what we hear, unless it initially strikes us as unlikely.
After a belief passes the front door, it usually doesn’t get much scrutiny. It becomes part of your “body of knowledge,” which is just another name for your impression of the way the world is, and it remains there until some new belief utterly clashes with it and you’re forced to reconsider. We easily forget our reasons (if we ever had any) for believing what we believe, and we’re seldom asked for them.
Don’t take my word for it, but you can be almost certain that a lot of the things you “know” aren’t really true. I would bet money that some of the facts of life you currently feel certain about can be found on this list of common misconceptions. It may surprise you to hear, for example, that sugar doesn’t cause hyperactivity in children, and that fortune cookies are actually Japanese in origin, not Chinese or American.
We do learn quite a bit about the world from direct experience. But clearly, most of our learning amounts to believing the beliefs of other people, whether they’re expressed in a Facebook post or in a textbook. You hear or read something, and if it seems true you’ll probably believe it. In all likelihood you’ll never try to verify that belief unless someone else challenges it, and it may never occur to you that it might be wrong. Once a belief has established itself, we freely tell others what we know, or think we know, and the process repeats.
This indiscriminate passing-on of totally unverified information is a bad habit we human beings have always had. We seem to be more interested in making impressions on others than really knowing what’s true. The primary reason we had witch hunts for centuries is the same reason there are people (in 2014!) trying to reduce their belly fat by doing situps. Without self-doubt and respect for evidence, you can spend endless effort building abs you will never see, and spend three centuries fighting evil spirits that were never there.
Of course, we can’t verify everything we believe. And even when we do check, how do we know that the second opinion is correct? The sources could be wrong. So we have to check the source’s sources. The problem is that as you follow these rabbit holes, they split in too many directions. Following each one could potentially take forever, and there’s a creeping suspicion that becoming 99.99% certain of this one single belief isn’t worth the work it takes.
For example, if a friend or social media acquaintance tells me, “Sugar is poisonous! Didn’t you know that?!” and cites Dr. So-and-So’s book as a source, do I need to read that book to know whether it’s true?
If I do read it, for the book to verify anything to me, I need to consult its sources, which probably amount to a pile of other books and some studies. Then I need to not only read these books and studies, but to understand them, and the efficacy of their methodologies, and by this point I’m hundreds of hours in the hole and way out of my depth. And I’ve only just begun, because it’s only fair to repeat this process with the books of each of Dr. So-and-So’s detractors, and even then there’s no guarantee that any of them are right.
Of course, I’m not going to do this, but I am going to dig a bit deeper than the surface, and that much is worthwhile. Following rabbit holes until you hit bedrock isn’t necessary for getting drastically smarter about what you believe. Things can get a hell of a lot clearer after the first few minutes of Googling: Oh, Dr. So-and-So is actually a chiropractor, not a scientist. Oh, Dr. So-and-So sells “fat-burning” nutritional supplements. Oh, there’s a warrant out for his arrest.
All of these are unverified claims too, and none of them are absolute proof that Dr. So-and-So is incorrect, but my digging has definitely given me a good reason not to tell other people that sugar is poisonous, at least for now.
Adapting to the Age of Bullshit
You’d think having the internet at our fingertips would make us more sure of what we’re talking about. But the problem is that human beings are just not in the habit of verifying things. So instead of using the internet’s incredible power to verify what we hear, we use its incredible power to absorb more unverified information than ever, and to pass it along at a greater rate than ever.
Karl Taro Greenfeld wrote a great article in the New York Times explaining how quick we can be to talk out of our asses in the internet era. He argues that because we live in the Information Age, we’re expected to have heard of everything already, and so we feel increased pressure to avoid admitting when we don’t know what we’re talking about:
Recently I was on the phone with an editor who mentioned a piece by a prominent author. I claimed I had read the story. It was only later in the conversation that it became clear to me that the article had not yet been published and I could not possibly have read it.
When you have something printed in a major publication, they ask you to guarantee, among other things, that it’s free of erroneous statements of fact (or they’ll check it themselves). To do that, you print it out, underline every truth claim you make, and either verify it, eliminate it, or modify it so that it’s true (instead of saying “____ is ____,” you might say, “I suspect that ____ is ____”).
This is a humbling process. The first time you do it, you realize how often you’re poised to say something that you really don’t know is true. This step is left out, however, when we’re just chatting or Facebooking. In our day-to-day lives we’re liable to be a lot more flippant about the truth than a writer preparing something for publication.
It would be sobering to have a transcript of everything you said on a given day, with every statement of fact underlined in red, then to have to spend a few minutes checking on each one to see if it’s true, or even likely to be true. I bet we’d all be shocked at how casually we make unqualified declarations about the world and the people living in it.
I’m calling myself out on this as much as anyone here. I don’t fact-check everything I say, on this blog and in person, and I don’t think it’s really feasible. But I am getting better at noticing that moment when I’m about to impart my “knowledge” to someone else, and qualifying it with, “My impression is…” or “I believe…” even if I feel fairly certain.
Doing this completely changes the kind of statement you’re making. Instead of making a claim about the way things really are, you’re just making a claim about your thoughts on the matter. You’re reminding yourself and the person hearing you that there is some degree of uncertainty (even if you don’t feel it) in almost everything you say.
That “I just know I’m right” feeling is not a reliable indicator that you’re right. In fact, it’s a good reminder to ask yourself how much digging you’ve really done on the question.
And that’s what I propose for making the Internet a little bit freer of nonsense: do a little bit of digging, as a habit, before you pass something on. I’m hoping this little bit of diligence becomes a normal thing to expect of each other, given that we live in an age where it’s easier than ever both to spread bullshit and to dispel it.
This diligence is especially important when you learn something that sounds particularly appealing to you — when you notice that you want it to be true. That’s a strong indication that you’re in danger of fooling yourself, and others.
Think of that feeling as an X that marks the spot where you ought to start digging, if only a little bit.