Can Social Media Stop Misinformation with Media Literacy?
Stopping a fire when it starts spreading
I was reading a great interactive article from growth.design about misinformation during the 2020 election, and how Facebook tends to feed the problem from a design perspective.
We all know that Facebook likes engagement: it means more people interact with the service and stay on it for longer.
But that’s one of the main reasons why misinformation spreads.
Misinformation tends to be more engaging than real information. Because of that, the algorithm is more likely to show you something false, simply because it is more likely to be shared.
When something is highly shared, people are more likely to share it as well, in something called the bandwagon effect.
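To make that concrete, here is a minimal sketch of an engagement-weighted ranker (my own illustration, not Facebook's actual system; the posts, weights, and scores are made up). The point is that a ranker optimizing only for predicted shares and comments will put the most "engaging" post on top, whether or not it's true.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_shares: float    # model's guess at how often this gets shared
    predicted_comments: float  # model's guess at comment volume
    is_accurate: bool          # ground truth the ranker never looks at

def engagement_score(post: Post) -> float:
    # The objective only rewards engagement; accuracy isn't part of it.
    return 2.0 * post.predicted_shares + 1.0 * post.predicted_comments

feed = [
    Post("Dry policy explainer", predicted_shares=5, predicted_comments=2, is_accurate=True),
    Post("Outrageous false claim", predicted_shares=40, predicted_comments=25, is_accurate=False),
]

# Rank purely by engagement: the false but "engaging" post lands on top.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):6.1f}  {post.title}")
```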
This reminds me of the content moderation problems that tech companies are facing. A lot of the work is in stopping misinformation before it goes viral.
People who are experts in this area have said that most of the damage is done once misinformation starts to pick up steam. By then, tons of people have already viewed it, and it's hard to delete, because people will say the tech companies are overreaching. The removal may even become a story in itself, via the Streisand effect.
Tech companies need to act as a circuit breaker. They started doing this in overdrive as COVID misinformation ramped up: Facebook and YouTube did their best to stop COVID misinformation from spreading, mostly on the algorithm side.
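As a sketch of what a "circuit breaker" could look like in code (my own toy illustration, not any platform's real pipeline; the window and threshold are arbitrary), you could track share velocity per post and pause algorithmic amplification once it trips, until someone reviews it.

```python
import time
from collections import defaultdict, deque

SHARE_WINDOW_SECONDS = 3600   # look at shares within the last hour (arbitrary)
SHARE_VELOCITY_LIMIT = 1000   # trip the breaker above this many shares per window (arbitrary)

share_log = defaultdict(deque)   # post_id -> recent share timestamps
paused_for_review = set()        # posts the algorithm stops amplifying

def record_share(post_id: str, now: float | None = None) -> None:
    now = time.time() if now is None else now
    log = share_log[post_id]
    log.append(now)
    # Drop shares that have fallen outside the window.
    while log and now - log[0] > SHARE_WINDOW_SECONDS:
        log.popleft()
    # Circuit breaker: once sharing is too fast, hold the post for human review.
    if len(log) > SHARE_VELOCITY_LIMIT:
        paused_for_review.add(post_id)

def should_amplify(post_id: str) -> bool:
    return post_id not in paused_for_review
```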
On the design side, the article recommended a nice solution to stop people from blind sharing: a simple pop-up box telling you to read the article before sharing it. This makes people stop and think, and may stop them from sharing misinformation. Twitter tested this and was able to reduce misinformation on the platform.
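Here's a rough sketch of that friction (hypothetical names and flow; neither growth.design's mock-up nor Twitter's real implementation): if the user is about to share a link they never opened, interrupt with a confirmation step instead of sharing silently.

```python
from typing import Callable

def try_share(user_opened_link: bool, confirm_anyway: Callable[[], bool]) -> bool:
    """Share flow with a 'read it first' speed bump.

    confirm_anyway shows the pop-up and returns True only if the
    user insists on sharing without reading.
    """
    if user_opened_link:
        return True          # normal, frictionless share
    return confirm_anyway()  # blind share: make them stop and think first

# Example: the link was never opened, so the prompt gets the final say.
shared = try_share(
    user_opened_link=False,
    confirm_anyway=lambda: input("You haven't read this yet. Share anyway? (y/n) ") == "y",
)
print("shared" if shared else "not shared")
```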
Sometimes removing misinformation requires hard decisions. The controversial banning of the former president led to a stark decrease in misinformation, by more than 50%.
Misinformation tends to spread through nodes in a network. A popular person in a group shares misinformation, and their fans run with it. Some of those fans are popular in their own smaller groups and share the same information, and their fans pass it on to friends and family. That's how you get your uncle talking about QAnon.
So shutting down a popular node is very useful, but it can be controversial. That's why most social media companies opt for shadow banning instead.
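A toy cascade simulation makes the point (the follower graph and numbers here are entirely made up): start the rumor at the biggest account, let every exposed follower reshare it, and compare the reach when different nodes are removed.

```python
from collections import deque

# Follower graph: account -> people who see (and reshare) its posts.
followers = {
    "big_account": ["fan1", "fan2", "fan3"],
    "fan1": ["uncle", "friend1"],
    "fan2": ["friend2"],
    "fan3": ["friend3", "friend4"],
}

def reach(start: str, banned: frozenset = frozenset()) -> int:
    """Count how many accounts the rumor reaches via reshares (simple BFS)."""
    if start in banned:
        return 0
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for fan in followers.get(node, []):
            if fan not in seen and fan not in banned:
                seen.add(fan)
                queue.append(fan)
    return len(seen)

print(reach("big_account"))                                     # 9: the full cascade
print(reach("big_account", banned=frozenset({"fan1"})))         # 6: a mid-tier node removed, its branch dies
print(reach("big_account", banned=frozenset({"big_account"})))  # 0: the popular node never spreads it
```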
Shadowbanning and its disadvantages
YouTube is a great example, with its treatment of "borderline content", which covers conspiracy theories, COVID denial, racist videos, and so on. YouTube simply suppresses those videos so they don't get recommended outside of their existing audience. This has led to the slow death of these YouTube channels, but it has also entrenched news incumbents even further, and it does not stop misinformation coming from traditional news channels.
People who just talk about current affairs in general have been hit too, like Philip DeFranco and other independent YouTubers, and the algorithm defaults to showing traditional news channels like the BBC, CNN, Fox News, and so on. Because of this, YouTube has pushed news toward a more establishment bias, which, while more level-headed, has biases of its own.
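Here is a rough sketch of how that kind of suppression could work (hypothetical labels and logic; YouTube doesn't publish its recommender): a video flagged as borderline stays available to people already subscribed to the channel, but it's filtered out of recommendations for everyone else, while mainstream outlets stay eligible by default.

```python
from dataclasses import dataclass, field

@dataclass
class Video:
    channel: str
    title: str
    borderline: bool = False   # flagged by policy classifiers or reviewers

@dataclass
class User:
    subscriptions: set = field(default_factory=set)

def recommendable(video: Video, user: User) -> bool:
    # Borderline videos are not deleted, just kept out of recommendations
    # for anyone who isn't already subscribed to that channel.
    if video.borderline and video.channel not in user.subscriptions:
        return False
    return True

candidates = [
    Video("BBC News", "Election results explained"),
    Video("random_channel", "The REAL truth about vaccines", borderline=True),
]

viewer = User(subscriptions={"PhilipDeFranco"})
recs = [v.title for v in candidates if recommendable(v, viewer)]
print(recs)  # only the mainstream video survives the filter for this viewer
```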
I understand why they did this. They want to get rid of the ranters talking about microchipped aliens while still providing news on their service. Traditional news networks are known entities; you don't want a PR disaster because you recommended a random YouTuber pushing anti-vax content. The tech companies can't know all their creators inside and out, so a blanket suppression policy is all they can really do.
But a lot of the work of media literacy can't be done by social media companies alone.
It is likely a failure in education.
Social media is only part of the problem
Schools don't teach kids how to think critically (and critical-thinking classes come with problems of their own).
What would help is teaching people from a young age to differentiate between different types of media.
And to ask questions like:
Is this website sketchy?
Does the article have any sources backing it up?
But that will be very difficult in a place like America, where local boards control the curriculum. That's not a bad thing in itself, but it makes changes like these hard to implement.
There is also a lack of incentive for political leaders to back these changes. Do you want a population that can think for itself and start asking hard questions about your policies? Hiding behind simple slogans would become a lot less effective.
I can’t imagine a politician signing up for that.
So while the problem is much deeper and systemic, I think some changes to social media can make it act as a firebreak, so things don't spill over into violence like we saw with the Capitol insurrection. If social media can do the job of not making the problem worse, and simply keep the effects neutral, that should be a win.
To recap, the changes social media can make:
Adjust their algorithms to avoid recommending extremist content.
Make simple design changes that let people stop and think before sharing content.