Infostealers Disguised as DeepSeek AI Tools

Written by Field Effect Security Intelligence Team | Feb 25, 2025 12:17:27 PM

Threat actors have been observed promoting two infostealers disguised as DeepSeek AI tools, distributed as packages that developers could download from the Python Package Index (PyPI) repository.

The two packages, named "deepseeek" and "deepseekai" in an obvious attempt to resemble the legitimate DeepSeek AI tool, were uploaded by an account that was created in June 2023 but had no activity since. The packages were both uploaded within minutes of each other on January 29, 2025, and were downloaded and likely installed by over 200 developers shortly thereafter.

Once the malicious packages are executed on the victim's machine, they collect user and system data, along with environment variables such as API keys, database credentials, and infrastructure access tokens. The collected data is then exfiltrated to a command-and-control (C2) server via Pipedream, a legitimate automation platform. A threat actor could then use the stolen data to gain unauthorized access to the victim's services and accounts.
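The collection stage described above amounts to little more than enumerating the process environment. As a harmless illustration of how trivially that data is exposed, the sketch below lists only the *names* of environment variables that look sensitive; the keyword list is our own assumption, not taken from the malware, and no values are read out or sent anywhere.

```python
import os

# Illustration only: enumerate environment variable *names* that look
# sensitive -- the kind of data the report says the stealers harvested.
# The keyword hints below are assumptions for demonstration purposes.
SENSITIVE_HINTS = ("KEY", "TOKEN", "SECRET", "PASSWORD", "CREDENTIAL")

def sensitive_env_names():
    """Return sorted names of environment variables matching a sensitive hint."""
    return sorted(
        name for name in os.environ
        if any(hint in name.upper() for hint in SENSITIVE_HINTS)
    )

if __name__ == "__main__":
    for name in sensitive_env_names():
        # Names only; an actual infostealer would exfiltrate the values.
        print(name)
```

Anything a build script or CI job places in the environment is visible to any package's install-time or import-time code, which is why credential rotation (see Mitigation below) matters after running an untrusted package.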

Once the two packages were discovered to be trojanized infostealers, they were reported to PyPI, which promptly removed them from the platform.

Source: Bleeping Computer

Analysis

This attack demonstrates how even sophisticated users like developers can be tricked into downloading and installing malware with clever social engineering tactics and malware obfuscation. In this case, the threat actor took advantage of DeepSeek AI’s recent popularity and developer interest in integrating the tool into software to increase the likelihood of the campaign’s success.

Due to the open nature of PyPI, which allows anyone with an account to upload a package, it’s often used as an attack vector to specifically target developers. This is because threat actors can not only gain access to the developer’s system, but also the chance to implant malicious code into whatever software the developer may be working on, ultimately leading to a supply chain attack.
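Both malicious names here are classic typosquats, one character-insertion away from "deepseek". One simple heuristic defenders can apply before installing a dependency is to measure the edit distance between a candidate name and the packages they actually intend to use; a very small, nonzero distance is a red flag. A minimal sketch (the legitimate-name list is our own example, not an official registry):

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via classic dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,        # deletion
                           cur[j - 1] + 1,     # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def typosquat_suspects(candidate: str, known_good: list[str]) -> list[str]:
    """Flag known-good names the candidate nearly (but not exactly) matches."""
    return [good for good in known_good
            if 0 < edit_distance(candidate.lower(), good.lower()) <= 2]

# Example: both malicious names sit within distance 2 of "deepseek".
print(typosquat_suspects("deepseeek", ["deepseek", "requests"]))
print(typosquat_suspects("deepseekai", ["deepseek", "requests"]))
```

This is only a heuristic: it cannot catch squats that reuse a legitimate name in a different ecosystem, and an exact match proves nothing about the package's contents.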

This isn’t the first time a threat actor has disguised a malicious PyPI package as an AI tool. In November 2024, threat actors uploaded malicious packages mimicking ChatGPT and Claude AI that contained a modified version of JarkaStealer, an infostealer capable of exfiltrating data from browsers and applications like Telegram and Discord. Before detection and removal, the malicious packages were downloaded over 1,700 times across more than 30 countries.

Mitigation

Field Effect’s Security Intelligence team constantly monitors the cyber threat landscape for threats related to open-source code repositories. This research contributes to the timely deployment of signatures into Field Effect MDR to detect and mitigate these threats. Field Effect MDR users are automatically notified when threat-related activity is detected in their environment and are encouraged to review these AROs as quickly as possible via the Field Effect portal.

Field Effect recommends that any users who downloaded and installed either of the two malicious PyPI packages immediately remove them and rotate their API keys, authentication tokens, and passwords, as these may now be compromised.
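Developers unsure whether either package made it into one of their environments can check the installed distributions directly. A minimal sketch using the standard library (the two names come from this advisory; everything else is illustrative):

```python
import importlib.metadata

# The two malicious package names reported in this advisory.
MALICIOUS = {"deepseeek", "deepseekai"}

def find_malicious_installs() -> list[str]:
    """Return any known-bad package names installed in this environment."""
    installed = set()
    for dist in importlib.metadata.distributions():
        name = dist.metadata["Name"]
        if name:
            installed.add(name.lower())
    return sorted(MALICIOUS & installed)

if __name__ == "__main__":
    hits = find_malicious_installs()
    if hits:
        print("WARNING: remove these packages and rotate all credentials:", hits)
    else:
        print("No known-bad DeepSeek lookalike packages found.")
```

Run this in each virtual environment separately, since `importlib.metadata` only sees the interpreter's own site-packages; `pip uninstall deepseeek deepseekai` removes any hits, but credential rotation is still required because the stealer runs at install time.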
