Your Dating App Data Is Training AI Without Your Consent

2025-07-22 · Paige Collings · 4 minute read
AI
Privacy
Dating Apps

Staying safe while dating online shouldn't be solely your responsibility. Dating apps have a duty to prioritize user privacy by default, and our laws should compel them to put user safety above profits. Unfortunately, many popular dating apps are taking shortcuts, compromising your privacy and security to rush out new AI tools—often by using your most personal information to train them.

The AI-Powered Features Invading Your Dating Life

The race to integrate AI is heating up across the dating app landscape. Grindr has big plans for its AI wingman bot, Bumble launched AI Icebreakers, Tinder introduced AI to help choose your profile pictures, OkCupid teamed up with Photoroom to erase exes from photos, and Hinge launched an AI tool to help you write better prompts.

While these features sound innovative, the privacy implications are enormous. Dating apps encourage us to share incredibly sensitive data, from our sexual preferences to our precise location. When this data falls into the wrong hands, the consequences can be devastating, as incidents affecting members of the LGBTQ+ community have already demonstrated.

This is why genuine, opt-in consent is crucial before any personal data is used to train AI. Users should have a reasonable expectation that their private conversations and photos will not be used for any purpose they haven't explicitly agreed to.
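To make that distinction concrete, here is a minimal sketch of what genuine opt-in consent can look like in code. The names here (AIConsent, messages_for_training) are hypothetical, invented purely for illustration: the point is simply that consent defaults to "no," is flipped only by an explicit user action, and personal data stays out of any training pipeline until that happens.

```python
# A minimal sketch of an opt-in consent gate. All names are hypothetical,
# for illustration only -- not any app's real API.

from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class AIConsent:
    granted: bool = False               # opt-in: the default is always "no"
    granted_at: datetime | None = None  # record when consent was given

    def grant(self) -> None:
        """Call only from an explicit, affirmative user action."""
        self.granted = True
        self.granted_at = datetime.now(timezone.utc)

    def revoke(self) -> None:
        """Withdrawing consent must be as easy as giving it (GDPR Art. 7(3))."""
        self.granted = False
        self.granted_at = None


def messages_for_training(messages: list[str], consent: AIConsent) -> list[str]:
    # Without explicit consent, private messages never reach the training set.
    return messages if consent.granted else []
```

Contrast this with a banner that reappears until the user relents: a default of "no" plus an equally easy path to withdraw is what separates real consent from manufactured consent.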

Bumble's AI Icebreakers and the GDPR Challenge

In late 2023, Bumble introduced AI Icebreakers to its “Bumble for Friends” feature. Powered by OpenAI’s ChatGPT, the tool was rolled out without ever asking users for their consent. Instead, the app presented a pop-up that repeatedly nudged users to click “Okay,” reappearing every time the app was opened until they finally gave in.

This practice is problematic enough on its own, but Bumble also shared personal user data with OpenAI to power the feature. In response, the European nonprofit noyb filed a complaint with the Austrian data protection authority, alleging several violations of the GDPR.

noyb's complaint specifically claims that Bumble:

  • Failed to inform users about the processing of their personal data for the AI feature.
  • Confused users with a “fake” consent banner.
  • Lacked a legal basis for processing the data, as it never obtained real consent.
  • Processed sensitive data (like sexual orientation) without the explicit consent required by Article 9 of the GDPR.
  • Failed to properly respond to a user's data access request.

Grindr's AI Wingman and Third-Party Data Risks

Grindr recently launched its own AI wingman, a chatbot designed to track matches and suggest date locations, with future plans to send messages and make reservations on the user's behalf. This feature is being built with a third-party company, Ex-human, raising immediate data-sharing concerns.

Grindr has stated that user data will remain on its own infrastructure and that it will ask for permission to use chat history for AI training. However, given Grindr's terrible track record on privacy—including a recent lawsuit for allegedly revealing users' HIV status—these promises may not be enough. With direct messages stored on the company's servers, users must simply trust that their data won't be misused, a trust Grindr has not earned.

AI Photo Selection: A Closer Look at Your Camera Roll

Both Tinder and Bumble have also launched AI tools to help you pick better profile pictures. Tinder's Photo Selector uses facial recognition to scan your device's camera roll and select photos based on the company's internal "learnings" about what makes a good profile. Users are not told what these parameters are, nor is there a clear policy about the potential collection of biometric data or how camera roll images are stored and used.

Putting users in control of their data is fundamental to protecting privacy. Everyone deserves the right to decide how their data is used. When it comes to something as personal as dating profiles and private messages, all companies must require explicit opt-in consent before using that data for AI training. Finding a connection shouldn't require you to sacrifice your privacy.

Beyond corporate responsibility, we need comprehensive consumer privacy legislation. Such laws would limit the amount of personal data that companies can collect in the first place, preventing it from being sold, breached, or used to train AI models without your knowledge.

We must urge these companies to put people before profit and build real privacy protections into their platforms. The safety of all users, especially those in vulnerable communities, depends on it.
