Meta oversight board examining company’s response to deepfake of Indian actress on Instagram

AI technology advancements have heightened concerns about sexually explicit fakes, often targeting women and girls. These are becoming increasingly difficult to distinguish from authentic content.


Meta Platforms’ independent Oversight Board is examining the company’s handling of two AI-generated sexually explicit images of female celebrities shared on Facebook and Instagram. The board has not named the women depicted, to avoid causing further harm; the images serve as case studies for evaluating Meta’s policies and enforcement practices around pornographic deepfakes.

The issue gained prominent attention when X, the Elon Musk-owned platform, temporarily blocked searches for Taylor Swift following a surge in fake explicit images of the pop star. In India, multiple actresses, actors and even sportsmen have fallen victim to deepfakes.

Key points from the Oversight Board’s current review include:

Nature of the Images: One image shared on Instagram depicted a nude woman resembling a public figure from India. The other, posted in a Facebook group, showed a nude woman resembling an American public figure in a sexually compromising pose.

Meta’s Actions: Meta removed the image of the American woman for violating its harassment policy, while the image of the Indian woman stayed up on Instagram until the board took up the case.

Source: https://www.businesstoday.in/technology/news/story/meta-oversight-board-examining-companys-response-to-deepfake-of-indian-actress-on-instagram-425819-2024-04-17
