Facebook tests tools to combat child sexual abuse

Facebook is testing a pop-up notification that appears when users search for content linked to child exploitation, along with an alert that warns users who attempt to share such content. The new tools are aimed at curbing searches for images and videos containing child sexual abuse and at preventing the sharing of that content. "The use of our apps to harm children is very offensive and unacceptable," Antigone Davis, who oversees Facebook's global safety efforts, said in a blog post on Tuesday.
The move comes as social networks face more pressure to combat the issue amid Facebook's plans to enable end-to-end encryption by default for messages on Facebook Messenger and on Instagram, its photo service. End-to-end encryption means that no one except the sender and recipient of a message can view it, including Facebook and law enforcement. Child safety advocates have raised concerns that Facebook's encryption plans could make it harder to crack down on child predators.

The first tool Facebook is testing is a pop-up notification that appears if a user searches for terms related to child sexual abuse. The notification asks the user whether they want to proceed and provides a link to offender diversion organizations. It also notes that child sexual abuse is illegal and that viewing these images can lead to consequences, including imprisonment.

Facebook users who search for terms associated with child sexual abuse content will see a pop-up notification urging them not to view these images and to seek help.



Last year, Facebook said it analyzed the child sexual abuse content it reported to the National Center for Missing and Exploited Children. The company found that more than 90% of the content was the same as or visually similar to previously reported content, and that copies of just six videos accounted for more than half of the child exploitative content reported in October and November 2020. "The fact that only a few pieces of content were responsible for many reports suggests that a greater understanding of intent could help us prevent this revictimization," Davis wrote in the blog post. Further analysis showed that users shared these images for reasons other than harming the child, including expressing outrage or in poor humor.

The second tool Facebook said it's testing is an alert that notifies users who try to share these harmful images. The safety alert tells users that if they share this type of content again, their account may be disabled. The company said it's using the tool to help identify "behavioral signals" of users who might be at risk of sharing this harmful content.

These alerts will help the company "educate them on why it is harmful and encourage them not to share" the content publicly or privately, Davis said.

Facebook is also updating its child safety policies and reporting tools. The social media giant said it will remove Facebook profiles, pages, groups and Instagram accounts "that are dedicated to sharing otherwise innocent images of children with captions, hashtags or comments containing inappropriate signs of affection or commentary about the children depicted in the image." Facebook users who report content will see an option to tell the social network that the photo or video "involves a child," which allows the company to prioritize it for review.

Images of online child sexual abuse surged during the coronavirus pandemic, Business Insider reported in January. From July to September, Facebook detected at least 13 million of these harmful images on its main social network and Instagram.