Facebook is expanding its use of artificial intelligence to moderate the content that gets uploaded to its platform in order to ensure that the community guidelines are not violated. The company has revealed that it’s now using a new artificial intelligence tool which is capable of proactively detecting and flagging revenge porn.

Revenge porn refers to intimate photos and videos of someone that are posted without their consent. It has been criminalized in some states and is, of course, not allowed on social media platforms. Facebook’s AI tool will ensure that none slips through on either Facebook itself or Instagram. Unlike existing filters, this tool will be able to detect “near-nude” content. All content that the tool flags is sent to a human moderator for review.
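Facebook has not disclosed how the system works internally, but the flow described above — a classifier scores an upload, and anything above some confidence threshold is routed to a human review queue rather than published — might be sketched as follows. Every name, and the threshold value, is an illustrative assumption, not Facebook's actual implementation.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: names and threshold are assumptions,
# not Facebook's real moderation system.
FLAG_THRESHOLD = 0.8  # assumed confidence cutoff for "near-nude" content


@dataclass
class Upload:
    upload_id: str
    score: float  # stand-in for a classifier's confidence in [0, 1]


@dataclass
class ReviewQueue:
    pending: list = field(default_factory=list)

    def submit(self, upload: Upload) -> bool:
        """Proactively flag the upload for human review if it
        crosses the threshold; otherwise let it through."""
        if upload.score >= FLAG_THRESHOLD:
            self.pending.append(upload)
            return True   # flagged before any user report
        return False      # published normally


queue = ReviewQueue()
print(queue.submit(Upload("a1", 0.93)))  # flagged -> human moderator
print(queue.submit(Upload("a2", 0.10)))  # not flagged
```

The key design point the article attributes to Facebook is the last step: a human moderator, not the model, makes the final call on flagged content.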

This is a significant change from how things were handled previously: Facebook and Instagram users had to report revenge porn themselves. “Often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared,” said Facebook’s global head of safety, Antigone Davis. The new tool will thus bring about an improvement by automatically flagging such content and sending it to human moderators for review before anyone even reports it.

Facebook has offered little detail on how this AI tool actually works, though, or what signals it picks up on in order to flag a photo or video as revenge porn.

