A discreet, silent feature designed above all to protect women
Tinder's new safety feature, developed in collaboration with Noonlight, is debuting in the United States. Aimed especially at women, it will let users discreetly and silently call for help if a date arranged with another user on the platform turns dangerous.
The news, which may soon spread to other countries, was announced by Match Group, the company that owns Tinder as well as many other dating apps, including Match itself, OkCupid, Plenty of Fish and more. In addition to the button for requesting assistance, other features will be introduced, such as location tracking, verification of photos uploaded to the profile, and an updated in-app safety center.
The decision follows the controversy sparked last December by a report from ProPublica and Columbia Journalism Investigations, which revealed that many sexual predators known to law enforcement had managed to use dating platforms without any particular difficulty. Until now, the apps have essentially left individual users responsible for their own safety. That is (finally) about to change.
The rollout begins on January 28 with Tinder, Match Group's flagship app, which counts 340 million downloads (5.7 million of them paying subscribers). Noonlight, a leading provider of emergency-response systems for apps, already works actively with Uber and Lyft, as well as Alexa, Google Home and Fitbit, to mention only the best-known names.
In practice, users who find themselves in a potentially dangerous situation will be able to press a button on their smartphone and receive immediate assistance. Before the date, they can share details such as the place and time of the meeting, as well as information about the person they are about to meet.
In addition, an artificial-intelligence-based system will verify that the photos uploaded to a profile match the real user, through a procedure in which the user takes selfies in prescribed poses, and quick commands will make it easy to report harassment directly within the messaging interface. Finally, machine-learning algorithms will detect when a message may be offensive and give its author the chance to unsend it on second thought.