AI to help Bumble detect and hide unwanted nude photos

The system will be capable of identifying inappropriate material with 98% accuracy.

Dating app Bumble is adding artificial intelligence (AI) technology that can automatically detect most nude images sent through the app and hide them from view.

Starting in June, a Private Detector feature will use machine learning to scan images in real time, automatically blurring any inappropriate photo and alerting the recipient, who can then choose to open or block the image.
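
As a rough illustration of the pipeline described above (classify each incoming image, blur anything flagged, and let the recipient decide what to do), here is a minimal sketch in Python using the Pillow imaging library. The classifier stub, threshold and function names are all hypothetical; Bumble has not published its model or implementation.

```python
# Illustrative sketch only: classify, blur on detection, let the user decide.
# The classifier below is a stub; Bumble's actual model is not public.
from PIL import Image, ImageFilter

NSFW_THRESHOLD = 0.5  # hypothetical confidence cut-off, not Bumble's real value


def nsfw_probability(image: Image.Image) -> float:
    """Stand-in for a trained nude-image classifier (hypothetical).

    A real system would run the image through a trained model and
    return its confidence that the picture is explicit.
    """
    return 0.0  # stub value so the sketch runs end to end


def scan_image(image: Image.Image) -> tuple[Image.Image, bool]:
    """Return the image to display and whether to alert the recipient.

    Flagged images are blurred before display; the recipient can then
    choose to reveal the original or block it.
    """
    flagged = nsfw_probability(image) >= NSFW_THRESHOLD
    if flagged:
        # A strong Gaussian blur hides the content while keeping a placeholder.
        return image.filter(ImageFilter.GaussianBlur(radius=25)), True
    return image, False
```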

The app, which only allows women to make the first contact with matched men, is making the move in a bid to improve safety for its five million UK users and 55 million globally.

The feature will detect inappropriate images with 98% accuracy, the company said.

“The safety of our users is without question the number one priority in everything we do and the development of Private Detector is another undeniable example of that commitment,” said Andrey Andreev, Bumble’s majority owner, who will also roll the feature out to Badoo, Chappy and Lumen, the other dating apps he owns.

“The sharing of lewd images is a global issue of critical importance and it falls upon all of us in the social media and social networking worlds to lead by example and to refuse to tolerate inappropriate behaviour on our platforms.”

Founder and chief executive Whitney Wolfe Herd, who has worked with politicians in the app’s home state of Texas to develop a bill making the sharing of lewd photos a punishable crime, said there is “limited accountability, making it difficult to deter people from engaging in poor behaviour”.

“The Private Detector, and our support of this bill, are just two of the many ways we’re demonstrating our commitment to making the internet safer,” she said.
