Deceptive AI Apps Plague Apple's Mac App Store Again

The Return of Deceptive AI Apps
Just two years ago, the launch of OpenAI’s GPT-4 API triggered a gold rush on the App Store. Suddenly, AI-powered apps for everything from productivity to nutrition trackers were dominating the charts. While the initial frenzy has cooled and Apple has toughened its stance on misleading apps, the problem hasn't vanished. Deceptive clones are finding their way back, posing new risks to users.
A Case Study in Impersonation
This week, a concerning example surfaced when security researcher Alex Kleber discovered a misleading AI chatbot at the top of the Business category on the Mac App Store. This "AI ChatBot" app brazenly impersonates OpenAI's branding, using a similar logo, name, and design to lure in users.
Further investigation revealed that the same developer, operating from a company address in Pakistan, is behind another nearly identical app. Both applications share the same interface, the same screenshots, and even a support website that leads to a barebones Google page. Despite Apple's efforts to purge copycats, both apps passed the review process and climbed to prominence on the U.S. Mac App Store. It's a stark reminder that an app's store ranking or approval is not a reliable indicator of its safety or legitimacy.
Sketchy GPT clone on the U.S. Mac App Store – 9to5Mac
The Hidden Dangers of Data Collection
The real threat from these apps lies in their data collection practices. A recent report by Private Internet Access (PIA) highlighted a disturbing lack of transparency in many productivity apps. The report found one popular AI assistant using the ChatGPT API was secretly collecting far more data than it disclosed on its App Store page. While the listing claimed it only gathered messages and device IDs, the privacy policy admitted to collecting names, emails, and extensive usage statistics. This kind of data is often sold to data brokers or used for other malicious purposes.
When a GPT clone app collects user inputs and ties them to real names, it creates a privacy nightmare. Imagine a massive database filled with personal conversations, all linked to the individuals who wrote them, managed by a shadowy company with a meaningless privacy policy. This isn't a hypothetical scenario; it's happening right now.
Apple's Privacy Labels Aren't Foolproof
One might expect Apple's App Store privacy labels to prevent this, but they have a critical weakness: the labels are self-reported by developers, and Apple largely takes them on trust. There is no robust system in place to verify their accuracy, which lets unscrupulous developers stretch the truth or lie outright.
A Final Word of Caution
These risky applications remain active, collecting unknown amounts of information from unsuspecting users, and the potential for privacy violations is immense. Be cautious and think twice before sharing personal information with any third-party AI app.
Stay safe ✌️

