Against the backdrop of hate crimes, racial prejudice and cyberbullying, Facebook is stepping up its efforts to curb the trend across its platforms and to dispel claims that it favors one group over another.
On Wednesday, the Silicon Valley giant announced the inauguration of the first 20 members of its Oversight Board. The independent body is empowered to overturn Facebook’s content moderation decisions as it deems fit.
The board will receive cases through a content management system linked to Facebook’s platforms. It will deliberate on each report it receives and decide whether the content in question will be allowed to stay up on Facebook’s sites or must come down.
The 20-member board will oversee appeals from Facebook and Instagram.
Facebook said in November 2018 that it was working to establish such a board to rid its platforms of propaganda, fake news and political misinformation. The company was aiming to avoid further controversies like the one that followed the 2016 American presidential election, when Russia’s interference was at the center of the discussion and Facebook was blamed for providing the platform that enabled it.
After the 2016 presidential election, the social media giant began removing posts deemed politically and socially unacceptable, and that did not go over well with many. Some conservative groups and lawmakers accused Facebook of censoring politically conservative points of view. Though Facebook denied it, the idea of instituting a neutral body to decide what stays up and what comes down was born out of that challenge.
“I have come to believe that we shouldn’t be making so many important decisions on free expression and safety on our own,” Mark Zuckerberg said in 2018.
The goal was to inaugurate an independent body that would lift the burden of content decision-making off the shoulders of Facebook’s team.
There were also reports at the time that some Facebook executives had tried to downplay and spin bad news. Coupled with the allegations from some U.S. political groups, the journey toward an independent board of decision-makers began.
The board’s diverse membership includes lawyers, journalists, human rights activists and professionals in digital rights, religious freedom, internet censorship and civil rights. Other members range from former heads of state to Nobel Prize winners to members of Facebook’s team.
Last year, Facebook said it was giving the board $130 million to cover its operational costs for six years, but it gave no details on how much board members would be paid.
Among its notable members are Alan Rusbridger, former editor-in-chief of The Guardian, and Andras Sajo, a former judge and vice president of the European Court of Human Rights.
Members of the board believe content decision-making poses a serious challenge, and that Facebook needs help now more than ever.
Helle Thorning-Schmidt, a former Prime Minister of Denmark and one of the board’s four co-chairs, said Facebook has had to make some of the most difficult decisions on its own. “Up until now some of the most difficult decisions about content have been made by Facebook and, you could say, Mark Zuckerberg,” she said.
Jamal Greene, another co-chair of the board, said it is time to do something about the challenges of content moderation and that Facebook’s approach is novel.
“It’s one thing to complain about content moderation and the challenges involved; it’s another thing to actually do something about it. These problems of content moderation really have been with us since the dawn of social media, and this really is a novel approach,” he said.
Michael McConnell, another co-chair of the board, said the goal is to make Facebook a neutral platform where everyone can air their views fairly.
“It is our ambition and goal that Facebook not decide elections, not be a force for one point of view over another, but the same rules will apply to people of left, right and center,” he said.
The board is expected to begin hearing cases in the coming months and to help Facebook avoid accusations of bias when controversial content is removed from the platform.