Over the past few days, since Trump's election (I can't believe I'm typing this), there has been a lot of talk about Facebook and the bubble effect. In case you don't already know about this, Facebook's news feed algorithm doesn't show you everything posted by your friends, the people you follow, or the pages you've liked. Instead, it shows only what it thinks will keep you engaged longer, so that Facebook makes more money from showing you ads. The Wall Street Journal tried to showcase this in regard to the US presidential election back in May.
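To make the difference concrete, here is a minimal sketch of the general idea; this is an assumption about how engagement-ranked feeds work in principle, not Facebook's actual code or ranking model:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int               # seconds since epoch
    predicted_engagement: float  # hypothetical model score: how long it keeps you on the site

def chronological_feed(posts):
    # Show everything your friends and pages posted, newest first.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_ranked_feed(posts, limit=20):
    # Show only the posts predicted to hold your attention longest;
    # everything else simply never reaches your screen.
    ranked = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)
    return ranked[:limit]
```

The point of the sketch is only that the second feed optimizes for your attention, not for completeness or accuracy.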
This is not new. What's new is how much of what Facebook shows you is false information. BuzzFeed wrote two great posts about it, and the Washington Post tried to quantify the problem.
Mark Zuckerberg tried to defend his company by saying he thinks the idea that fake news on Facebook influenced the election in any way is "pretty crazy". This is very awkward, considering that Facebook's main way of making money is selling ads on the platform. If I were an advertiser, I'd want him to explain to me why he thinks ads can drive purchase decisions when the platform supposedly can't influence people.
The influence problem becomes even bigger when so many people get their news on social media. According to a Pew Research Center study, 44% of Americans get their news on Facebook, and 64% of social media news consumers get their news on just one site. When you are one of the biggest news distributors, showing people only what they like, without even checking whether it's fake, is dangerous.
A filtered bubble is something most people desire. Our free time is limited, and when we read posts on social media, we want them to come from people we care about and concern topics we like. That's why we choose whom to follow and which pages to like. We curate our own bubble. We don't need Facebook to hide real news and promote fake stories just to keep us on the site a little longer and make a bit more money.
It should not be taken as a fact that filtered bubbles helped Trump get elected. A total of 500 newspapers and magazines endorsed Hillary Clinton; only 26 endorsed Donald Trump. This shows that the media have lost much of their influence on people. On the other hand, Trump supporters would see those endorsements as propaganda, an effort from a corrupt system to keep its power. It is unknown whether, or how much, Facebook or any other medium influences people, and how much an algorithm that chooses what is seen changes that.
What is known, though, is that Trump is a fascist, a bigot, a racist, a liar, a misogynist, a sexual predator, an orange 💩. My question is this: as a tech leader and innovator, do you want to be the one who helps fight fascism, or the one who tries to monetize it?