Facebook has used its machine learning prowess to filter hate speech and fake accounts, and now it’s turning its focus to fighting revenge porn. In an announcement last week, the company said it is introducing new technology to remove non-consensual intimate photos and videos from its platform. The social network claims the new AI will help it weed out nude and near-nude photos and videos before anyone reports them. Once the AI detects such content, a Facebook staffer will review it to determine whether it violates the site’s community guidelines. If they find that the content infringes the social network’s terms,…

This story continues at The Next Web