
Bumble Is Introducing A New Feature To Block Unwanted Pictures In Your Chats

by Lizzy Rosenberg

If you're a millennial, chances are you've downloaded some sort of dating app at some point in your life. And if that's the case, you've most likely had one (or more) users send you, well, inappropriate pictures, if you catch my drift. In some cases, these photos may be welcomed with open arms, but what about when they're completely unsolicited? That's where Bumble's new "Private Detector" feature comes in clutch: it automatically blurs out "lewd" pictures so you can decide for yourself whether you'd like to see them.

On Wednesday, April 24, Bumble shared news about its latest feature aimed at protecting its users. Launching in June 2019, the technology, called "Private Detector," will roll out as a new safeguard for Bumble users, according to a press release from the brand. Private Detector uses artificial intelligence (AI) to screen images sent in private chats. With 98 percent accuracy, it can determine in real time whether a photograph is "inappropriate," and if an image is deemed "lewd," the feature will automatically blur it out.

When someone sends one of these images, the recipient is alerted that they've received something deemed potentially inappropriate and can choose whether to view the image or block it. Then, if you feel so inclined, you can even report it.
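Bumble hasn't published how Private Detector works under the hood, but the flow it describes (classify an incoming photo, blur the preview if it's flagged, and let the recipient decide what to do next) could be sketched roughly like this. The classifier, threshold, and function names below are illustrative assumptions, not Bumble's actual code.

```python
from dataclasses import dataclass
from typing import Callable

from PIL import Image, ImageFilter

# Illustrative cutoff only; Bumble has not published its model or threshold.
FLAG_THRESHOLD = 0.5


@dataclass
class ChatImage:
    original: Image.Image   # the photo as sent
    preview: Image.Image    # what the recipient sees first
    flagged: bool           # whether the recipient is asked before viewing


def screen_image(path: str, classifier: Callable[[Image.Image], float]) -> ChatImage:
    """Classify an incoming chat photo and blur the preview if it's flagged."""
    image = Image.open(path)
    score = classifier(image)  # assumed to return a 0..1 probability
    if score >= FLAG_THRESHOLD:
        # Heavily blur the preview so nothing is visible until the recipient
        # explicitly chooses to view, block, or report the image.
        preview = image.filter(ImageFilter.GaussianBlur(radius=30))
        return ChatImage(original=image, preview=preview, flagged=True)
    return ChatImage(original=image, preview=image, flagged=False)
```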

In the press release, Andrey Andreev, Bumble's majority owner and active partner, said Bumble values the safety of its users and that the creation of this new feature is an example of its commitment to keeping them protected. Andreev is also the founder of the dating app group that includes Badoo, Bumble, Chappy, and Lumen, and all of the apps in the group will be equipped with Private Detector this summer.


According to the press release, Andreev said of the new feature:

The safety of our users is without question the number one priority in everything we do and the development of ‘Private Detector’ is another undeniable example of that commitment. The sharing of lewd images is a global issue of critical importance and it falls upon all of us in the social media and social networking worlds to lead by example and to refuse to tolerate inappropriate behavior on our platforms.

To further its efforts to protect app users, Whitney Wolfe Herd, the founder and CEO of Bumble, has been working alongside Texas lawmakers on a bill that would make sharing unsolicited inappropriate photos a crime, according to the press release. Thus far, the bill has unanimously passed the Committee on Criminal Jurisprudence, and as of April 24, it is set to be debated in the Texas House of Representatives.

According to the press release, Wolfe Herd said:

The digital world can be a very unsafe place overrun with lewd, hateful and inappropriate behavior. There’s limited accountability, making it difficult to deter people from engaging in poor behavior. I really admire the work Andrey has done for the safety and security of millions of people online and we, along with our teams, want to be a part of the solution. The 'Private Detector,' and our support of this bill are just two of the many ways we’re demonstrating our commitment to making the internet safer.

So, there you have it. You will soon have the chance to not be, um... bombarded with unsolicited "private" pics. You can still look at them if you'd like, but they'll soon be almost entirely avoidable. What a relief.