If you’re browsing for ChatGPT-style apps on the App Store, proceed with caution.
ChatGPT’s massive popularity has led to a rise in copycats that mislead users. Ever since OpenAI launched an API for ChatGPT, developers have been able to build their own apps and products using the large language model (LLM). But developers have flooded the App Store with lookalikes that use shady tactics to garner subscriptions and ratings. Security researcher Alex Kleber discovered multiple clone apps created by the same developer. So consumers might think they’re choosing between competing chatbot apps, when in reality they’re picking between apps with identical code under slightly different names and interfaces.
For the record, ChatGPT is free and doesn’t have a mobile app. For $20 per month you can get a ChatGPT Plus subscription, which gives you access even during peak hours plus faster response times. But other than that, anyone can use ChatGPT’s core features at no cost. And while there isn’t an official app, you can easily use ChatGPT in your phone’s browser.
If you’ve been on OpenAI’s official ChatGPT website, then many of these third-party apps will look quite familiar to you. Many of these apps that show up in search results on the App Store have a green color scheme and logo similar to OpenAI’s. It’s as if they are attempting to pose as the real deal for users seeking out a nonexistent official app.
There are legitimate AI chatbot apps that use OpenAI’s API and provide real value, and that kind of value-adding app may well be what you’re after. So if you’re in the market for one, here’s what to look out for so you can steer clear of sketchy, ChatGPT-style knockoffs.
Paywall windows with no close button
“AI Chat Bot- Writing Assistant” and “ChatBot Powered by GPT-4” are both desktop apps with a free version and a paid version, charging $5.99 per week and $4.99 per week respectively. But the apps deliberately obscure the free version: the paywall that pops up on first launch has no obvious close button. (AI Chat Bot- Writing Assistant has a tiny gray “Skip Offer” button at the bottom of the screen.) Screenshots provided by Kleber show there’s no apparent way to close the window, leaving the user to either subscribe or quit the app.
Credit: Alex Kleber
Mashable found another example of this with a mobile app called “AI Chat – Chatbot AI Assistant.” The paywall immediately pops up upon downloading the app, but the “X” button doesn’t appear for several seconds. This could lead users to think their only option is to tap the “Start Free Trial” button, which starts charging the user after three days.
Credit: Mashable
Sketchy means of getting a rating
This same app (AI Chat – Chatbot AI Assistant) requests that the user review the app immediately after installing it. This is bad practice according to App Store guidelines, which say “Avoid showing a request for a review immediately when a user launches your app, even if it isn’t the first time that it launches.”
Credit: Mashable
The desktop apps that Kleber examined did the same, and additionally prompted the user to rate the app every time they asked a question in the chat. Asking for a review before the user has even tried the app is a tactic typically used by scammers to climb the App Store rankings quickly. Users are effectively forced to rate the app just to start using it.
The ChatGPT copycats have all or mostly five-star ratings, which is a red flag. Sure, plenty of great apps have appropriately great ratings, but if every review is five stars with emphatic-yet-generic comments, the comments are likely fake.
Same developer, different apps
Many of these apps were created by the same developer using a multitude of different developer accounts. In Kleber’s report, he noticed that one developer in particular had deployed ChatGPT clones across eight separate developer accounts.
Why does this matter? It’s not against Apple’s terms of service to have more than one developer account, and there are legitimate reasons to do so. Say, for example, a developer created both a social networking app and a niche e-commerce platform, and wanted to make them two distinct entities.
But in many cases these ChatGPT apps weren’t distinct. They were cloned, maybe with slight alterations, treated as separate apps, and published in the App Store under separate names. A developer doing this could fill the App Store with multiple versions of the same app in order to dominate search results for AI chat applications, stifling any potential competition. Consumers are left with false choices, not knowing that the AI chatbot apps they are trying to choose from are actually carbon copies with the exact same developer behind them.
And cheating the system like that is against Apple’s App Store policies.
Cut and dried scams? Not so fast.
Most of the time, any AI chatbot or writing app you see will be powered by OpenAI. OpenAI makes its APIs available and fairly low-cost so that developers can use them to build things. And as long as these developers don’t claim to be affiliated with OpenAI or label their products, say, “The Official ChatGPT App,” they’re probably in the clear in terms of legality and app store rules.
And merely recreating a free experience in app form doesn’t make a product shady. Some developers have built apps that provide language model access with a more streamlined UI or a variety of prompt templates. There’s a value-add there.
But in general, those who download these ChatGPT copycats don’t appear to be getting much, if anything, that isn’t already available on the ChatGPT mobile site. And those who pay money to use these apps likely just haven’t read the fine print.