
May 12, 2025

New ‘Noodlophile’ infostealer disguised as AI video generator


A new cyber threat has emerged involving counterfeit AI video generation tools that distribute a malware strain known as 'Noodlophile.' These deceptive websites, with names like "Dream Machine," are promoted through prominent Facebook groups, masquerading as sophisticated AI platforms capable of creating videos from user-uploaded files. However, instead of delivering the promised content, these sites provide a ZIP file containing a malicious executable disguised as a video file.

Upon execution, the malware initiates a multi-stage infection process. The ZIP archive includes an executable file misleadingly named to appear as a video (e.g., 'Video Dream MachineAI.mp4.exe') and a concealed folder housing additional components necessary for the malware's operation. This tactic exploits the common Windows setting that hides file extensions, making it easier for users to mistake the malicious file for a legitimate video.
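Because Windows hides known file extensions by default, a name like 'Video Dream MachineAI.mp4.exe' displays as an innocuous-looking video file. Defenders can flag this pattern by checking for a media extension immediately followed by an executable one. The sketch below is illustrative only; the extension lists and function name are not from the campaign analysis:

```python
from pathlib import Path

# Media extensions commonly used as a decoy, and executable
# extensions that Windows will actually run.
MEDIA_EXTS = {".mp4", ".avi", ".mov", ".mkv", ".jpg", ".png"}
EXEC_EXTS = {".exe", ".scr", ".bat", ".cmd", ".com", ".pif"}

def has_deceptive_double_extension(name: str) -> bool:
    """Flag names like 'Video Dream MachineAI.mp4.exe' where a media
    extension directly precedes an executable extension."""
    suffixes = Path(name.lower()).suffixes
    return (
        len(suffixes) >= 2
        and suffixes[-1] in EXEC_EXTS
        and suffixes[-2] in MEDIA_EXTS
    )

print(has_deceptive_double_extension("Video Dream MachineAI.mp4.exe"))  # True
print(has_deceptive_double_extension("holiday.mp4"))                    # False
```

A check like this could run over the contents of downloaded ZIP archives before any file is opened; it catches the lure described here but not every disguise (e.g., right-to-left override tricks), so it complements rather than replaces endpoint detection.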


Security researchers have identified Noodlophile as part of a malware-as-a-service scheme, being sold on dark web forums. Often bundled with services labeled "Get Cookie + Pass," this malware is linked to operators who communicate in Vietnamese. The campaign underscores the evolving strategies of cybercriminals, who are now leveraging the popularity of AI tools to lure victims into downloading harmful software.

Source: Bleeping Computer

Analysis

This isn’t the first time a threat actor has disguised an infostealer as an AI-related tool. For example, in February 2025, threat actors were observed promoting two infostealers disguised as DeepSeek AI tools. The two packages were named "deepseeek" and "deepseekai" in an obvious attempt to resemble the legitimate DeepSeek AI tool.

In November 2024, a different threat actor uploaded malicious packages mimicking ChatGPT and Claude AI that contained a modified version of JarkaStealer, an infostealer capable of exfiltrating data from browsers and applications like Telegram and Discord. Before detection and removal, the malicious packages were downloaded over 1,700 times across more than 30 countries.

Given the effectiveness of the technique, it’s unlikely threat actors will stop disguising malware as popular AI tools anytime soon. Thus, users should be cautious of unsolicited advertisements and verify the legitimacy of online services before downloading or executing files.

Mitigation

Field Effect’s Security Intelligence team constantly monitors the cyber threat landscape for infostealers like Noodlophile. This research contributes to the timely deployment of signatures into Field Effect MDR to detect and mitigate these threats. Field Effect MDR users are automatically notified when threat-related activity is detected in their environment and are encouraged to review these AROs as quickly as possible via the Field Effect portal.
