A public letter to Facebook
Facebook, your new plan for users to sign off on trusted news sites will only make the echo chamber much, much worse
Dear Facebook,
Recently, you announced you’d be changing the news feed to bring people together, to burst the bubbles everyone says you have either created or deepened. You notified us that we would start seeing more friends and family, and fewer “Pages.” Some people have gotten mad at you because they think you’re trying to make more money off advertisements and such – perhaps correctly – but I actually think this is a great premise to work under, and a great goal to work towards.
Yesterday, you announced your second major update, this time aimed explicitly at correcting “fake news.” You notified us that you would help us ensure that the news on Facebook comes only from trusted sources, and you humbly allowed us, the readers, to decide for ourselves whom we trust and whom we don’t – who provides real news and who provides fake news.
“There’s too much sensationalism, misinformation and polarization in the world today. Social media enables people to spread information faster than ever before, and if we don’t specifically tackle these problems, then we end up amplifying them,” Zuckerberg wrote yesterday, and I partly agree. This is definitely what is happening in the world, and social media amplifies voices we have and have not heard. This could be a problem that needs to be tackled, but it could also be an opportunity to be seized, if we want to grow into a new way of seeing and acting in the world.
What we’ve learned in the past couple of years at 0202 — Points of View from Jerusalem is that while one person sees X as real and Y as fake, another – very often, his neighbor — sees Y as real and X as fake. By deciding “that having the community determine which sources are broadly trusted would be most objective,” by asking people how much they trust a certain news source, you’ll never be able to break the bubble of fake news. You’ll actually be making it worse.
You see, we live in a world full of people who are sure they’re right, and everyone else is wrong. This means that when I read a news source I trust, I believe wholeheartedly that my news source checked everything and these are the facts. But when my neighbor – a Palestinian woman living two kilometers away from me, with the same municipal voting rights as I have – wakes up in the morning and reads her news sources, she might be reading the complete opposite of what I read, and she’ll believe it just as wholeheartedly as I believe mine. So you’ll ask both of us which news sources are credible and which aren’t, and you’ll get exactly opposite answers. What will you do with this information?
What will happen, as I see it, is that you’ll make the hegemony-minority rift bigger, because only news trusted by the majority will appear on Facebook, and minority groups will be pushed further to the sidelines. Because Facebook has such a huge influence on public discourse, this means minority issues will slowly dissolve from the public sphere, and we’ll be on the dangerous path to fascism, to blindness, to the disruption of what Facebook and the Internet have positively created: a multicultural, smaller, more complex, more interesting world. You will wipe out any new ideas that might come out of the minorities, the radicals, the creative – because you will destroy any idea that isn’t popular. And you know who in our history weren’t popular, but advanced us because they were actually brilliant? Winston Churchill. Rosa Parks. John Adams. Harvey Milk. And really, every single person behind every single reform for human rights, every single person on the path to creating a more equal and just world in our entire history, was alone in thought in the beginning. Every single one. And you’re on the path to wiping them out. You’re creating a platform that eerily resembles the world of Orwell’s 1984.
There’s actually an even deeper issue at hand. People act based on the news they believe in. I’ll give you a pretty extreme, but very real, example. When my Arab neighbor – remember, we live in the same city here in Jerusalem; we should be seeing the same news, we should be agreeing on the same objective facts, but we don’t – when that neighbor reads in his trustworthy news source that his neighbor got shot because she tried to pull a cellphone out of her bag and was mistaken for attempting to stab an Israeli civilian, he’ll get angry. Because he’ll believe this is the truth. And two kilometers away from him, I’ll read that Israeli forces neutralized a terrorist in an attempted attack. And we’ll continue on in our daily lives, he believing that he can get shot down for a regular human action like pulling out his cell phone, and I believing that at any given moment, my Arab neighbor wants to stab me. This is a pretty big problem, especially in the turmoil of the Jerusalem reality. In a world with your proposed solution to the problem, you’ll wipe out one of our media sources and keep the other. You’ll choose my worldview or his worldview as the correct one, and leave no room for understanding.
When you pilot this new idea this week, please keep in mind that no news is 100% real, and all news is partially fake. Keep in mind that each news source has its followers, and its followers remain loyal. Each news source has its haters, and its haters remain hateful. Keep in mind that these new algorithms won’t change these patterns; they’ll only deepen them. They’ll only keep each community more internal, more self-justified, more angry at globalization, more de-legitimized and de-legitimizing of “the other.”
Instead of deciding based on surveys and majority votes who is trustworthy and who is not, instead of walking down the path that gives more power to the powerful and takes away the little power the disempowered have, I propose we start listening to each other. I propose that your algorithms give us more of what we don’t want to see, as opposed to more of what we do want to see. Instead of wiping out these eclectic voices – which will only make them stand more defiantly within their own communities against the closing in of the hegemony – I propose we make those voices stronger. I propose we start to see what the other sides see, not in order to agree with them, but in order to understand the complexity of the times in which we live. Instead of creating a hegemony of news, I suggest we start seeing the array of voices that make up our world, encouraging us to talk to each other, to understand why my neighbor sees X as true and I see it as fake. I suggest we start to see that fake news and real news are not the issue. The issue is only seeing one type of news. And you, Facebook, are going to make it worse.
My hope, like yours, Mark, is that Facebook will become a real marketplace of ideas and cross-cultural, cross-political interactions where – in your words – “we’re strengthening our relationships, engaging in active conversations rather than passive consumption,” and seeing what our neighbors see and think, as opposed to de-legitimizing them in the first place, making the walls between us higher.
Dear Facebook, please be careful how you work towards what you wish for. In my humble opinion, your actions will create the opposite of your proposed intentions.