A sophisticated phishing campaign is deceiving iPhone users by impersonating trusted AI brands, specifically OpenAI’s ChatGPT and Google’s Gemini. Attackers are sending fraudulent emails that entice recipients to download counterfeit applications from the Apple App Store. This operation capitalizes on the reputation of well-known AI platforms, which millions rely on daily, to create a façade of legitimacy.
The phishing emails are crafted to resemble genuine communications from ChatGPT and Gemini. They target business users, marketers, and social media professionals, promoting the fake apps as essential tools for advertising management and business enhancement. Each email contains a direct link that leads recipients to a seemingly authentic app listing on the Apple App Store, a platform that most users trust without question.
During an investigation, analysts from SpiderLabs identified two fraudulent listings on the Apple App Store. The first was GeminiAI Advertising with the identifier id6759005662, and the second was Ads GPT with the identifier id6759514534. Both apps were discovered on the Australian App Store storefront, illustrating the global reach of this phishing scheme.
Upon launching either application, users are met not with AI functionalities but with a deceptive Facebook login screen. This screen prompts users to enter their credentials, under the pretense of linking an account for advertising purposes.
The tactics employed in this campaign represent a notable shift among credential-harvesting threat actors. Instead of relying on fake websites or malicious email attachments, these attackers strategically infiltrated an official app marketplace, enhancing the perceived legitimacy of their operation. The Apple App Store is generally viewed as a secure environment, which amplifies the risk for unsuspecting users. The presence of these malicious apps, even temporarily, underscores the challenges associated with vetting applications on large-scale digital distribution platforms.
Understanding the Credential Theft Mechanism
The effectiveness of this phishing campaign hinges on a meticulously orchestrated trust chain that begins long before victims engage with the fake app. An email purporting to originate from a recognized AI platform sets the expectation that the linked tool is credible and beneficial. By the time victims navigate to the App Store and install the application, they have passed through multiple credibility checkpoints, reinforcing their belief that they are interacting with a legitimate product.
Once installed, the application skips any genuine onboarding flow and immediately presents a Facebook login screen. The interface closely mimics Facebook’s native design, giving the average user no clear indication that something is amiss. Credentials entered through it are captured in real time and transmitted to attacker-controlled servers, handing the threat actors access to victims’ personal Facebook profiles, ad accounts, and managed pages and maximizing the payoff for this financially motivated operation.
To mitigate the risks associated with such phishing attempts, users receiving unsolicited emails promoting AI-powered applications should verify the sender’s actual email address instead of relying solely on the display name. It is advisable to cross-check the developer name, read user reviews, and look for inconsistencies in the app description before downloading any application. Enabling two-factor authentication on Facebook and other social media accounts can provide an additional layer of protection, even in cases where passwords have been compromised.
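The display-name check described above can be automated with Python’s standard-library email tools. The sketch below flags a sender whose display name invokes a trusted AI brand while the actual address sits on an unrelated domain; the brand-to-domain map is an assumption for illustration, and real mail filtering should also consult SPF/DKIM/DMARC results rather than rely on this heuristic alone.

```python
# Sketch: flag From headers whose display name claims a trusted AI brand
# while the actual address belongs to an unrelated domain.
from email.utils import parseaddr

# Assumed allow-list for this sketch; verify real sending domains yourself.
BRAND_DOMAINS = {
    "chatgpt": {"openai.com", "email.openai.com"},
    "openai": {"openai.com", "email.openai.com"},
    "gemini": {"google.com"},
}

def is_spoofed_sender(from_header: str) -> bool:
    """Return True if the display name invokes a known brand but the
    address's domain is not on that brand's allow-list."""
    display_name, address = parseaddr(from_header)
    domain = address.rsplit("@", 1)[-1].lower() if "@" in address else ""
    lowered = display_name.lower()
    for brand, domains in BRAND_DOMAINS.items():
        if brand in lowered and domain not in domains:
            return True
    return False

print(is_spoofed_sender('"ChatGPT Ads Team" <support@chatgpt-ads.xyz>'))  # True
print(is_spoofed_sender('"OpenAI" <noreply@email.openai.com>'))           # False
```

This is exactly the manual check recommended to users, expressed as code: trust the address after the `@`, not the name in front of it.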
Organizations are encouraged to raise awareness about this type of phishing campaign within their teams. Employees should be reminded to report any suspicious emails promoting software downloads, regardless of how familiar the impersonated brand may appear. By being vigilant, users can better protect themselves from these increasingly sophisticated phishing schemes.