The US-based social media giant Facebook is tightening its measures against child abuse, rolling out new tools to spot such content and taking stricter action against material that crosses the line.
Facebook has updated its guidelines to make it clear it will remove Facebook or Instagram accounts dedicated to sharing images of children posted along with captions, hashtags or comments containing innuendo or inappropriate signs of affection.
“Using our apps to harm children is abhorrent and unacceptable. We are developing targeted solutions, including new tools and policies to reduce the sharing of this type of content,” said Antigone Davis, Facebook’s global head of safety. “We’ve always removed content that explicitly sexualizes children, but content that isn’t explicit and doesn’t depict child nudity is harder to define.”
Sometimes the images alone may not break Facebook’s rules. Under the new policy, the company will examine the accompanying text to better determine whether the content sexualizes children and decide whether the associated profile, page, group or account should be removed.
New tools being tested include one that triggers pop-up messages in response to search terms associated with child exploitation, warning of the consequences of viewing such material and suggesting that people seek help to change the behavior.
Facebook’s safety alert is intended to inform people sharing child exploitation content about the harm it causes and the legal consequences. In addition to removing posts that violate platform policies, Facebook reports them to the National Center for Missing and Exploited Children (NCMEC).
“We are using insights from this safety alert to help us identify behavioral signals of those who might be at risk of sharing this material,” Ms. Davis said.
Facebook’s analysis of illegal child-exploitative posts reported to the NCMEC late last year found that more than 90 percent of the posts were the same as or very similar to previously reported content. Just six videos accounted for more than half the content reported in that period.
Facebook worked with the NCMEC and other groups to assess the apparent intent of people sharing such content. More than 75 percent of the sharing examined did not appear to be malicious but was done for other reasons, such as expressing outrage or misguided attempts at humor, according to Davis.
Facebook has also sparked concern among law enforcement agencies with its plan to provide end-to-end encryption across all its messaging platforms, a move that police say could let criminals hide their communications.