WeTransfer's AI Data Grab Sparks Artist Backlash
The Spark of the Outrage
If you've ever needed to send a file larger than 20MB, chances are you've used or at least heard of WeTransfer. You may also have heard about the recent social media uproar, led by artists who began sharing screenshots of the company's updated terms of service. They highlighted a concerning clause that appeared to grant WeTransfer the right to use any material sent through its service, without payment or regard for the uploader's privacy.
Deconstructing the Controversial Clause
This author, a long-time user of the service, decided to dig into WeTransfer's legalese to understand the nature of the recent outrage.
The company's new terms, published on July 14, would have granted WeTransfer "a perpetual, worldwide, non-exclusive, royalty-free, transferable, sub-licensable license to use your Content for the purposes of operating, developing, commercialising and improving the Service or new technologies or services, including to improve performance of machine learning models that enhance our content moderation process, in accordance with the Privacy & Cookie Policy".
For anyone wondering how far this license could reach, the terms specified "the right to reproduce, distribute, modify, prepare derivative works based upon, broadcast, communicate to the public, publicly display, and perform Content." The document explicitly noted that users would receive no compensation for their content being repurposed by the company.
A Dystopian Turn for Digital Privacy
One might argue that this is simply the price of using a free service. However, the sheer scope of the privacy intrusion permitted by WeTransfer's new terms was staggering. The author thought of the 200-page Green Card petition, full of confidential letters, sent to immigration lawyers via the service, and of the private photographs shared for a collection. The fear became concrete: what if a version of a unique work turned up in a WeTransfer banner ad?
As an artist, the author was already protective of their work, but the AI clause introduced a new fear: "Will they make ugly and derivative coasters out of my photographs?"
These self-serving terms bring to mind dystopian science fiction like The Matrix, where machines use people as batteries without their consent. In these stories, the ultimate evil is the invasion of privacy, the hijacking of agency, and the lack of informed consent.
WeTransfer's Response and Lingering Questions
For a company that built its brand on supporting artists and not being a typical tech giant, this move felt like a betrayal. Following the intense online backlash, the Amsterdam-based company updated the problematic clause. It now reads: "In order to allow us to operate, provide you with, and improve the Service and our technologies, we must obtain from you certain rights related to Content that is covered by intellectual property rights. You hereby grant us a royalty-free license to use your Content for the purposes of operating, developing, and improving the Service, all in accordance with our Privacy & Cookie Policy."
While this reads like a climb-down, the company effectively still claims the right to use any content transferred via its services. And although EU users can request data deletion under the GDPR, WeTransfer's own Privacy & Cookie Policy contains a significant loophole: it states that the company can refuse deletion if its "interest in using the information outweighs your wish for its deletion," particularly to protect its services from fraud.
In a blog post addressing the controversy, WeTransfer clarified that the AI training clause was for potential future use and has since been abandoned. The post also pointed out that its terms already contained broad language allowing it to use, reproduce, and create derivative works from user content long before the AI mention was added.
The Real Cost of Free Services
Essentially, WeTransfer had long held the right to repurpose user content, and most of us were unaware or indifferent until AI training was mentioned explicitly. This raises a compelling question: are we only uncomfortable with our data being exploited when the process is automated by machines?
Digital companies have been monetizing our data for two decades, often without sophisticated AI. The ongoing backlash against machine learning highlights how little many of us understand about the technology we use daily. It's a stark reminder of the old saying: "If the product is free, you are the product."