One causes cancer; the other could be just as addictive, if some of the technology industry's leaders this week are to be believed.
Clearly, Facebook hasn't had the best of times lately. Fresh from being exploited by Russian agents seeking to swing the 2016 United States presidential election, it has had to come up with measures to root out fake news.
It’s a task the social media network wasn’t set up to do. Meant to connect people and make stuff go viral, it can’t just build the safeguards and fact checks overnight to root out fake accounts, trolls and other elements out to damage democracy.
While it has tried to ban ads on fake news sites of late, after pressure from governments around the world, the truth is it has grown too big.
With more than 2 billion active users, Facebook will find it an unending battle to root out users who spread falsehood and incite violence through hate speech.
No sooner is a system set up than it is gamed to death by those who seek to overcome it. Just consider its latest attempt to reduce what you see from news publishers, prioritising instead what your friends share.
There’s really nothing stopping these users from sharing more of the same hateful rhetoric that has driven people to extremes in the US, for example.
Last month, Facebook said it was working with third-party fact-checkers, such as Snopes, ABC News and Politifact, to sort out the fake stuff. That was a good thing.
Then last week, Facebook’s Mark Zuckerberg said users themselves would determine how news outlets rank in terms of trustworthiness. That’s bad news.
In the same way that false reports are spread by the masses, this is a recipe for exploitation by those who have already shown how they can work the system.
Wikipedia, the open encyclopedia, is a great example. It has to regularly fend off people looking to pass off fiction as fact.
Singapore saw this play out briefly when the ethnicity of President Halimah Yacob was being debated in the run-up to last year's election.
A Wikipedia page was edited and re-edited, with each person trying to bring her race (she has an Indian father and Malay mother) into focus.
To many Singaporeans, race does not define one’s qualifications (a Yahoo poll in 2016 placed Deputy Prime Minister Tharman Shanmugaratnam, an Indian, as a favourite to lead the country, though he later said he did not want the job).
Yet the presidential election, which Halimah won in a walkover because it was reserved for Malays and did not attract another eligible candidate, brought out angry reactions on social media.
Shortly after she was elected as the first woman president here, she had detractors saying she was not their president, because they never got to vote.
The People’s Action Party, which backed her, also said it was going to pay a political price for pushing through the first such election reserved for a minority race.
What has this got to do with Facebook, then? Clearly, there are limits to what a social media network can control, given the charged-up political atmosphere it has created through online echo chambers.
Though it wants to moderate some of the extreme elements on the network, it is rather late now to set new rules after the game has kicked off. People are already deeply divided. They want to believe what they want to believe.
To be fair, what it’s doing now is better than nothing. It sure beats simply collecting ad dollars from these peddlers of falsehood.
But will Facebook engage in a long-term battle against errant members? Will it stay ahead of the new ways they will find to game the system?
Unless it wants to become a censor of sorts, it will find that task a tough one. In creating this viral network, it has built an online world it just cannot keep up with.
Perhaps it's time the audience wised up. For a democracy to work, users have to find the facts for themselves. Just as with smoking, one day you have to stop picking up the cigarette. Warnings and labels can only do so much.