Developers are again publicly pointing to cases where Apple has failed to keep scam apps out of the App Store. The apps in question charge users unusually high fees and siphon revenue away from legitimate, higher-quality apps. While Apple has previously come under fire for failing to block such apps from publication, developers this week complained that Apple was actively promoting some of them.
Apple’s Australian App Store published a story called “Slime Relaxations” highlighting a particular category of apps. But according to some developers and observers, some of those apps carry very high subscription costs despite offering little functionality.
Take, for example, an app with the convoluted name of “Jelly: Slime Simulator, ASMR”. Unless users subscribe, the app is filled with ads; it plays more than one in a row before the user can interact with it in any meaningful way. A report from MacRumors said the app “has a $13 a week subscription” to remove those ads. (When we downloaded the app ourselves, we were prompted to subscribe at a significantly lower price of $7.99 per week. It’s unclear to us whether prices have changed since the initial reports or whether it’s a regional difference.)
In either case, as MacRumors also noted, Apple’s App Store Review Guidelines explicitly say that Apple will “reject expensive apps that try to cheat users with irrationally high prices.” That’s obviously subjective and open to interpretation, but some developers claim that this app and others featured in the “Slime Relaxations” story cross that line.
These are not new problems. In February, developer Kosta Eleftheriou pointed out a scam app for the Apple Watch backed by fake reviews. Apple removed the offending app after Eleftheriou’s observations were widely reported on Twitter and in the media. But Eleftheriou and other developers have continued to identify even more scam apps.
Apple defended its efforts to keep scam apps out of the App Store in a statement to The Verge when the press reported on Eleftheriou’s findings:
We take feedback about fraudulent activity seriously, investigate every report and take action. The App Store is designed to be a safe and trusted place for users to get apps, and a great opportunity for developers to be successful. We do not tolerate fraudulent activity on the App Store and have strict rules against apps and developers attempting to cheat the system. In 2020 alone, we terminated over half a million developer accounts for fraud and removed over 60 million user reviews that were deemed spam. As part of our ongoing efforts to maintain the integrity of our platform, our Discovery Fraud team is actively working to remove these types of violations and are constantly improving their process.
Apple continues to play whack-a-mole with these apps, but several developers have complained, both publicly and privately, that the company is taking too long. One developer with whom we exchanged emails claimed that when they discovered a scam app stealing assets from their own legitimate app, one clearly designed to siphon off its users, it took Apple 10 days to remove it, while Google took only “1-2 days” on the Android side. The app was allowed back on Apple’s App Store after the stolen assets were removed. During the long wait, the developer of the legitimate app lost a significant number of users and revenue, while the developer of the copycat app benefited.
While Apple fights legal battles to keep third-party app stores off iOS, arguing that such alternative stores could be less secure than its own, developers’ claims that scam apps slip through Apple’s defenses undermine that position. The company has ample incentive to stop scam apps, and the will seems to be there. But the processes Apple uses to achieve that goal seem far from perfect, and as a result, both users and legitimate developers are at risk.
Given what’s at stake for Apple in addressing this issue, it’s hard to imagine that the examples developers have uncovered are instances of malice rather than incompetence. But for developers and users, the consequences can often be the same.