Facebook said its artificial intelligence and machine learning software has enabled the social media giant to remove nearly 9 million sexual images of children in the past three months. Almost all of these images, 99%, were removed before any user reported them, Facebook said in a statement.
The tech giant said its moderators removed 8.7 million child abuse images in the past three months. Facebook's so-called 'Community Standards' ban child exploitation, and to avoid the potential for abuse, the company takes action on non-sexual content as well, such as seemingly benign photos of semi-naked children or children in a bath. "We also remove accounts that promote this type of content," Facebook's Global Head of Safety Antigone Davis said.
AI-driven software automatically picks up images that contain both nudity and a minor, the company explained.
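Facebook has not disclosed how its detection system is built, but the "both nudity and a minor" rule it describes can be illustrated with a minimal sketch. The classifier scores below are hypothetical stand-ins for the company's undisclosed models; the point is only that an image is flagged when two independent signals are present at once, not either alone:

```python
# Illustrative sketch only: Facebook's actual implementation is not public.
# nudity_score and minor_score stand in for hypothetical ML classifiers,
# each returning a probability in [0, 1] for its category.

NUDITY_THRESHOLD = 0.9  # assumed cutoffs, chosen for illustration
MINOR_THRESHOLD = 0.9

def should_flag(nudity_score: float, minor_score: float) -> bool:
    """Flag an image only when BOTH signals exceed their thresholds,
    mirroring the 'nudity and a minor' rule described above."""
    return nudity_score >= NUDITY_THRESHOLD and minor_score >= MINOR_THRESHOLD

# An image scoring high on both signals is flagged;
# one scoring high on only a single signal is not.
print(should_flag(0.95, 0.97))  # True
print(should_flag(0.95, 0.10))  # False
```

Requiring both signals, rather than either one, is what lets such a system leave ordinary adult-nudity or everyday child photos to separate policies while acting automatically on the intersection.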