Meta oversight board calls for overhaul of nude photo standards

A row over photos of bare torsos with covered nipples posted on Instagram by a transgender and non-binary couple sparked a call for the platform to clarify its content guidelines.

Content creators have long criticized Facebook and Instagram for their content moderation policies relating to photos showing partial nudity, arguing that their practices are inconsistent and often biased against women and L.G.B.T.Q. people.

This week, the oversight board of Meta, the platform's parent company, urged the company to clarify its guidelines on such photos after Instagram removed two posts showing transgender and non-binary people with bare chests.

The posts were quickly reinstated after the couple appealed, and the oversight board overturned the original decision to remove them. It was the board's first case directly involving gender non-conforming users.

"The restrictions and exceptions to the rules on female nipples are extensive and confusing, particularly as they apply to transgender and non-binary people," Meta's oversight board said in its case summary on Tuesday. "The inherent lack of clarity in this policy creates uncertainty for users and reviewers , and makes it impractical."

The issue arose when a transgender and non-binary couple posted photos, in 2021 and 2022, of their bare chests with their nipples covered. The captions included details of a fundraiser for one member of the couple to undergo top surgery, a gender-affirming procedure to flatten a person's chest. Instagram removed the pictures after users reported them, on the grounds that the depiction of breasts violated the site's Sexual Solicitation Community Standard. The couple appealed the decision, and the photos were later reinstated.

The couple's back-and-forth with Instagram highlighted criticism that the platform's guidelines for adult content are unclear. According to its community guidelines, Instagram prohibits nude photos but makes exceptions for a range of content, including mental health awareness posts, depictions of breastfeeding and other "health-related situations" - rules that Meta's board described in its summary as "convoluted and ill-defined."

How to decide which depictions of people's breasts should be allowed on social media platforms has long been a source of debate. Dozens of artists and activists argue that there is a double standard under which posts showing women's breasts are more likely to be removed than those showing men's. The same is true for transgender and non-binary people, advocates say.

The Meta Oversight Board, a body of 22 academics, journalists and human rights experts, is funded by Meta but operates independently of the company, and its decisions are binding on it. The group recommended that the platform clarify its community standard on adult nudity and sexual activity "so that all people are treated in a manner consistent with international human rights standards, without discrimination on the basis of sex or gender."

The board also called for "a comprehensive assessment of the human rights impact of such a change, involving various stakeholders and creating a plan to address any harm identified."

Meta has 60 days to review the oversight board's summary, and a company spokesperson said it would publicly respond to each of the board's recommendations by mid-March.

