Facebook, which doubles as a much-sought-after news medium, is in the spotlight this time for popularizing news that turned out to be fake to the bone. On August 31st, its news feed ran a false story claiming that anchor Megyn Kelly had been fired from Fox News for expressing liberal support of Hillary Clinton in her political opinions. The story gained traction faster than a forest fire and sparked heated discussion. Realizing its folly, Facebook removed the story, apologized, and promised to straighten out its system.
However, this section seems to have a lingering trail of bugs yet to be hunted down.
The experiment that went off-track
The news team at Facebook held an important post: filtering stories so that only fact-checked, properly reported, legitimate news was published. In an initiative to make its algorithms its mainstay, Facebook fired its team of editors and deferred to its algorithms, posting whatever the algorithmic search surfaced. Things went downhill once fake news started trending.
Looking into the problem, it was found that between August 31st and September 22nd, Facebook boosted five news stories containing questionable content or baseless claims. Irrelevant links leading to retail purchase sites were also found embedded in the text.
One such story covered the iPhone 7 launch event, quoting a breathless account of the iPhone's magical features from fakingnews.firstpost.com. The common thread across all such posts was that they garnered large numbers of views, comments, and shares, along with an equally strong political response.
Repercussions and regrets
As the first source of news for scores of people, Facebook should have its algorithms on point and its news accurate. It is also in a sensitive position: once a mistake is made, it cannot be fully wiped off the net, because countless people will already have viewed and shared it.
Facebook’s trending stories section is meant to serve as a dashboard displaying the day’s most important and most discussed news, with the picks monitored by an editorial team. Ideally, a human examines the algorithm-picked content, curates it to suit public viewing, and then posts it.
Last August, Facebook removed the editorial team and completely automated the process. This proved to be a bad move: explicit and objectionable content cannot be reliably screened out, even after adding an endless list of keywords to ban it.
Defying the sixth sense
Human editors have their own opinions, and those invariably surface when they review topics that run against their views. To alleviate this, Facebook installed algorithms as a replacement. This, however, put the veracity of the news under the gavel, shaking the very foundation of the section’s purpose.
This calls the intelligence of machines into question and suggests that human judgment, even though biased, remains the best tool for identifying the authenticity of a source, which is vital.
Google, in an initiative to refine similar algorithms in its own news portal, Google News, has taken the first step toward rectifying such flaws.
Facebook should realize that some tasks cannot be handed over to machine intelligence, and this fiasco teaches that lesson. Human auditing is essential to understand the sentiments a story may stir and the repercussions a misinterpretation can trigger.
Collaborator: Shruti Balakrishnan