Facebook “lost” an important policy for three years and only noticed the error after its Oversight Board began examining the problem. In its decision, the board reviewed Facebook’s internal policies and said the company should be more transparent about whether other key policies may have been “lost” as well.
The case stems from an Instagram post about Abdullah Öcalan, in which the poster “encouraged readers to engage in a conversation about Öcalan’s imprisonment and the inhumane nature of solitary confinement.” (As the board notes, Öcalan is a founding member of the Kurdistan Workers’ Party, which Facebook has officially designated a “dangerous organization.”)
Facebook initially removed the post because its users are prohibited from praising or showing support for dangerous organizations or individuals. However, Facebook also had “internal guidance” — created in part as a result of discussions about Öcalan’s imprisonment — that allows for discussion of the conditions of confinement of individuals designated as dangerous. That rule was not applied, even after the user’s initial appeal. Facebook told the board that it “inadvertently failed to transfer” that part of its policy when it moved to a new review system in 2018.
Although Facebook has since admitted the mistake and restored the post, the board said it was “concerned” about how the case was handled, and that an important policy exception had effectively gone missing for three years.
“The board is concerned that Facebook lost specific guidance on an important policy exception for three years,” the group wrote. “Facebook’s default policy of removing content that shows ‘support’ for designated individuals, while keeping key exceptions from the public, allowed this error to go unnoticed for a long time. Facebook only learned that this guidance was not being applied because a user decided to appeal the company’s decision to the board.”
The board also faulted Facebook for not being transparent about how many other users could have been affected by the same problem. Facebook told the board that it was not “technically feasible” to determine how many other posts might have been removed by mistake. “Facebook’s actions in this case indicate that the company does not respect the right to remedy, violating its corporate human rights policy,” the board said.
The case highlights how Facebook’s complex rules are often shaped by internal guidelines that users can’t see, and how the Oversight Board has repeatedly pushed the company to make all of its policies clearer to users.
Although it has ruled on only a handful of cases so far, the Oversight Board has repeatedly criticized Facebook for its lack of transparency. “Facebook can’t just invent new unwritten rules when it suits them,” board co-chair Helle Thorning-Schmidt told reporters after the board said Facebook was wrong to impose an “indefinite” suspension on Donald Trump. The board has also criticized Facebook for failing to alert users to key parts of its policies, and has pressed the company to clarify how it treats speech from politicians and other prominent figures.
Facebook has 30 days to respond to the Oversight Board’s decision in this case, which includes several recommendations to further clarify its “Dangerous Individuals and Organizations” policy and to update its transparency reporting process.