With all the post-election hubbub, fake news is getting a lot of discussion. There are claims going around that the outcome of the election was a result of:
- The 'filter bubble' you get trapped in on social media, specifically Facebook, which hides all the things you don't agree with; and
- The virality of factually incorrect but belief-confirming articles, which get shared around by people who want them to be true.
I've actually had most of this post drafted for a while, but with all the above context, now seemed like the right time to finish it. I use a simple 'trick' to be wrong less often - although calling it a trick is probably too generous; personally I feel it's just common sense. However, as the saying goes, common sense doesn't seem to be that common. The trick? I simply take the time - usually only 30 seconds or so - to check the facts before I believe something. That's it.
II. Check Yourself...
A few weeks ago I came across this:
Nice idea, but literally the first result on Google for 'headrest emergency' is this article by Snopes, which explains that while it might be a convenient secondary use, no, they were not designed for this.
Snopes is a great site for this - their whole schtick is investigating internet myths and marking them as true or false. Kind of like a modern-day internet MythBusters. If the first few results on Google don't make the truth clear, whacking 'snopes' on the end of your query usually clears it up.
Just this morning, this popped up in my Facebook feed:
Why yes, Kurt, it does sound a bit crazy. Crazy that Cobain actually said this, anyway. It seemed fishy that this could possibly be a real quote, and sure enough - Snopes to the rescue.
Before sharing that crazy fact or unlikely quote - check yourself. Don't just blindly swallow everything that lands in front of you. It may seem harmless for small things like this, but every little "fact" you accept as truth pushes the mental Venn diagram of 'beliefs' and 'reality' just a tiny bit further apart.
III. Seeking the Devil's Advocate
There is also a kind of 'level two' to this habit, which goes beyond the simple 30-second fact check and which, for lack of a better term, I call 'seeking the devil's advocate.' When it isn't so simple to check whether something is true or false - either because it's a long or complex train of thought that doesn't lend itself to googling, or because the subject matter is too complex for the 'truth' to be readily available - you need to go deeper.
Hacker News is a tech-related news website with comment sections on many posts. The discussion in these comments often adds context that completely changes my interpretation.
For example, this guide on building muscle was linked there. To those without much nutrition or workout knowledge it might appear to be a good guide, but the comment section calls out a few relevant points:
- The author's experience consists of "I wrote the first draft of this guide months ago. I meant to publish this then. But I unexpectedly lost half the muscle I had gained," so his opinion on bodybuilding should be taken with a grain of salt.
- There are factual errors, such as "Women gain muscle at the same speed as men" - something that isn't true, thanks to the biological differences caused by testosterone.
Nothing in that article stands out as something you have to fact check, but as a whole it's juuust off the mark in subtle ways.
Another example is an article advising (regarding salary negotiation) that you should never accept a counteroffer from your existing company when you are about to leave. The article makes some good points in defense of this, but they are mostly opinions rather than straight-out facts, and you'd be hard-pressed to 'disprove' the article. However, the comments provide some context that, to me, was just as important as the original article:
Recruiter here, almost 20 years. Do a search for articles warning about counteroffers and you'll find almost all are written by agency recruiters. Why is that?
It's because agency recruiters lose probably millions in fees every year when candidates accept a counter... Some counteroffers are a mistake, but the people who scream not to accept counteroffers the loudest are the ones with the most to lose.
Again, interesting article, but useful to have both perspectives. Neither is obviously more correct, but just by being aware that there are two perspectives, you are a long way toward figuring out which one is the truth.
Reddit is another good discussion site that often has great 'devil's advocate' commentary.
Take this article on child obesity, which links it to poor sleep and skipping breakfast. Taken at face value, I might have read that and thought, "Yeah, I guess those things upset your metabolism." The first comment on the Reddit page offers an alternative, more likely explanation:
Both these aspects, missing sleep and not eating breakfast, generally correlate to an unstable home setting, which generally can be related to poverty.
That is, missing sleep and breakfast may be correlated with obesity without causing it, contrary to what the article suggests.
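The confounding at work here can be sketched with a toy simulation, where a hidden common cause drives both variables. All the probabilities below are invented purely for illustration, not taken from the article:

```python
import random

random.seed(42)

# Toy model of a confounder: an unobserved factor ("unstable home")
# raises the chance of BOTH skipping breakfast AND obesity.
# Skipping breakfast has no causal effect on obesity in this model,
# yet the two still come out strongly correlated.
n = 10_000
skipped, obese = [], []
for _ in range(n):
    unstable_home = random.random() < 0.3       # hidden common cause
    p_skip = 0.6 if unstable_home else 0.1      # home situation drives breakfast habits
    p_obese = 0.4 if unstable_home else 0.1     # ...and independently drives obesity
    skipped.append(random.random() < p_skip)
    obese.append(random.random() < p_obese)

# Compare obesity rates between breakfast-skippers and non-skippers
n_skip = sum(skipped)
rate_skip = sum(o for s, o in zip(skipped, obese) if s) / n_skip
rate_eat = sum(o for s, o in zip(skipped, obese) if not s) / (n - n_skip)
print(f"obesity rate, skipped breakfast: {rate_skip:.2f}")
print(f"obesity rate, ate breakfast:     {rate_eat:.2f}")
```

The skippers show a much higher obesity rate even though, by construction, breakfast does nothing - the gap is entirely the hidden variable. A naive reading of the raw correlation would conclude causation.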
I like to think I'm pretty good at remembering 'Correlation does not equal causation' - it was drilled into me over the seven years I spent studying towards my statistics degree - but it can be very easy to forget if you're lazily browsing the internet and not in the middle of a statistics exam.
"Correlation does not imply causation, but it does waggle its eyebrows suggestively and gesture furtively while mouthing "look over there."" - xkcd
Then there are books. I feel like books carry an automatic extra level of psychological 'authenticity' over news articles, kind of like how a business card makes your business seem more legit than not having one. It makes a certain sort of sense, in that publishing a book involves extra editing and a lengthier, more expensive process than publishing an article on a website. But that doesn't mean you should take a book at face value.
I read Flash Boys not too long ago, which on the surface explains high-frequency trading and how HFT firms are screwing over all the other firms. It's a great story - Michael Lewis knows how to spin a decent yarn - but there were a few holes in the logic. Then I read 'Flash Boys: Not So Fast,' written by someone who worked in the HFT industry and who went chapter by chapter through the first book, explaining the math and how the system actually works, until you could see those holes were really huge stinking pits.
IV. ...Before You Wreck Yourself
It would be a disservice to talk about this without mentioning that most people simply don't care if they're wrong. I used to get confused, and even a little angry, that people could be so naive and thoughtless, but now I think I understand at least some of the factors at play.
Part of it is selection bias - the people who do fact-check themselves don't end up posting anything. Imagine 100 people stumble across the 'headrest' factoid above, and 90 of them google it and see that it's false. Those 90 don't post anything, whereas the other ten do. From your perspective, all you see is ten people sharing the same incorrect fact, making it seem like more of a problem than it really is.
Part of it is perverse incentives. My favourite example of this is the famous story about cobras in Delhi, India. The city at the time was suffering from a cobra problem, so the British colonial government decreed that it would pay a bounty for each dead cobra. Initially this was successful, but then some clever people realised they could breed cobras, farming them just for the reward. When the government realised what was happening, it cancelled the bounty program - and all the people who had been farming cobras set their now-worthless snakes free, making the problem far worse than it had ever been in the first place.

This is an extreme example, but it illustrates how incentive structures can sometimes make things worse. I think there is an element of this in the fake news problem - people are rewarded on Facebook in the form of likes and shares (or sometimes money), which means they are incentivised to post whatever gets the biggest reaction. If the reward is bigger than the risk of punishment - and it is, because nobody bothers to fact-check - then you end up in a situation where the boring truth gets sidelined in favour of interesting falsehoods.
Finally, there's a distinction between two types of belief: those that are there to be used, and those that are there for other purposes. Read the link in the last sentence if you want a full explanation, but in short: some beliefs - such as 'I believe the train leaves at 3pm' - serve a practical purpose, so you want them to be as accurate as possible. You do care if this belief is wrong, because being wrong means missing your train, and you don't get defensive when someone corrects you; you say thanks. Then there are crony beliefs, which aren't for practical purposes. The name comes from the fact that these beliefs aren't there to be productive workers; they are 'cronies,' there for political and social kickbacks. Believing in the 9/11 conspiracy won't change the way you live your daily life, but it does provide social kickbacks in the form of automatic respect from other believers of the same crony belief. Similarly, the social benefits of believing certain political views far outweigh any pragmatic benefits, and the warm fuzzies we get when people like and share our Facebook posts far outweigh the need for truth in a lot of situations.
I question whether I should comment on such statuses and point out obvious falsehoods. I'm not making any friends by doing it. More and more lately, I lean towards not questioning people to their face - smile, nod, look it up in private. This is, I think, the correct approach when either a) the misinformation is harmless, or b) saying something isn't going to change anyone's mind. Sure, anti-vaxxer beliefs are harmful, but someone who holds them isn't likely to have their mind changed by a random stranger on the internet.
Check yourself - always seek the devil's advocate for any new information you take on board - and you're doing better than 90% of the population. Educate others who are open to being educated. Just don't waste your time being a keyboard warrior.
Two weeks after this was originally posted, Facebook announced tools to prevent fake news.
It feels weird to have the wind taken out of my sails on this issue so quickly, but I'm glad! These look like some interesting first steps to combat fake news. The TLDR is that Facebook has made a tool letting users click on the corner of a post and report an article as fake news. The report then goes to one of four 'vetting' companies, one of which is Snopes, which I recommended in the original blog post. If an article is reviewed and deemed fake news, it gets a big red 'fake news' label and a penalty in the news feed ranking algorithm.
This tool seems like a great start to combating internet misinformation, and I'm looking forward to seeing whether it's successful.
Silly of me to think that Facebook would make any changes that would meaningfully change anything. This issue still exists today, and it will continue to exist as long as people have the ability to talk to each other. Passing on things we've heard elsewhere without verifying them (i.e., gossip) is as old as humanity.
Wow, typing that made me feel old. It doesn't feel like that long ago that MythBusters was the hot new TV show. ↩︎
This effect is called 'domain specificity': if you learn something in a certain context or domain, it is easier to recall when you are in that domain (and conversely, harder to recall when you aren't). This is part of why teachers always advise you to study under exam conditions, in a quiet environment. ↩︎
The author of 'The Big Short' and 'Moneyball.' ↩︎