
Malware disguised as ChatGPT apps is being used to lure victims, Meta says

Facebook’s parent company Meta has issued a warning that hackers are exploiting people’s interest in ChatGPT and other generative AI apps to trick them into installing malware that purports to provide AI functionality.

Since March, Meta has discovered about 10 malware families that use AI themes to compromise business accounts across the internet, including business social media accounts, and has blocked more than 1,000 unique ChatGPT-themed malicious URLs from being shared on its platforms.

“Over the past few months, we’ve been investigating and taking action against strains of malware taking advantage of people’s interest in OpenAI’s ChatGPT to trick them into installing malware that purports to provide AI functionality,” Meta said in a blog post.

Meta detected malware strains such as DuckTail and NodeStealer posing as ChatGPT browser extensions and productivity tools, and attributed them to threat actors based in Vietnam.

DuckTail steals browser cookies

One of the malware strains that has increasingly targeted victims with AI-themed lures is DuckTail. DuckTail steals browser cookies and hijacks logged-in Facebook sessions to harvest information from the victim’s account, such as location data and two-factor authentication codes. The threat actors use the malware to hijack Facebook Business accounts the victim has access to, and from there gain access to Facebook ad accounts.
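To make that behavior concrete, the short Python sketch below (illustrative only, and not Meta’s tooling) shows one way a defender could audit locally installed Chrome extensions for the permission combination a DuckTail-style stealer needs: the "cookies" permission together with host access to facebook.com. The extensions directory path is an assumed Linux default and will differ on other systems and browser profiles.

"""Illustrative sketch: flag installed Chrome extensions whose manifests
request the 'cookies' permission together with Facebook host access,
the combination a cookie/session stealer would need."""

import json
from pathlib import Path

# Assumed default Chrome profile location on Linux; adjust for your OS/profile.
EXTENSIONS_DIR = Path.home() / ".config/google-chrome/Default/Extensions"

def suspicious_manifests(ext_root: Path):
    # Layout is <extension id>/<version>/manifest.json
    for manifest in ext_root.glob("*/*/manifest.json"):
        try:
            data = json.loads(manifest.read_text(encoding="utf-8"))
        except (json.JSONDecodeError, OSError):
            continue
        perms = data.get("permissions", []) + data.get("host_permissions", [])
        wants_cookies = "cookies" in perms
        touches_facebook = any("facebook.com" in str(p) for p in perms)
        if wants_cookies and touches_facebook:
            yield data.get("name", "unknown"), manifest.parent.name, manifest

if __name__ == "__main__":
    if not EXTENSIONS_DIR.exists():
        print(f"No extensions directory at {EXTENSIONS_DIR}")
    else:
        for name, version, path in suspicious_manifests(EXTENSIONS_DIR):
            print(f"Review extension '{name}' v{version}: {path}")

A hit from a script like this is not proof of malware, only a prompt to review where the extension came from and what it actually does.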

“In its latest iteration, DuckTail operators, likely in response to our round-the-clock detection that terminated stolen sessions, began automatically granting business admin permissions to ad-related action requests sent by the attackers, in an attempt to speed up their operations before we block them,” Meta said.

Copyright © 2023 IDG Communications, Inc.

Recent reports uncovering a malicious plot to lure computer users with unauthorized ChatGPT apps have left many wondering what action they should take. Meta’s research team has been quick to comment on the matter and is warning ChatGPT users about potential vulnerabilities in these apps.

What is ChatGPT? It is an artificial intelligence technology that enables contextual conversations with virtual agents. It is used to provide automated customer support, build smarter bots, and power other automated customer interactions.

Meta’s research team has unearthed malware disguised as ChatGPT apps that is being used to exploit unsuspecting users. The bogus apps pose as a ChatGPT service and entice victims with the promise of free rewards. Once infected, users are urged to complete surveys and tasks, none of which lead anywhere, and their personal data is exposed in the process.

Meta recommends avoiding “official-looking” ChatGPT apps that have not been reviewed by the Google Play Store or Apple App Store. Cybersecurity experts also advise users to download applications only over a secure connection and to keep their operating system and security software up to date, so they are protected against the latest threats.
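As a practical complement to that advice, the minimal Python sketch below shows how a user could verify a downloaded installer against a publisher-provided SHA-256 checksum before running it. The file path and expected digest are placeholders passed on the command line, not real values.

"""Minimal sketch: compare a downloaded file's SHA-256 digest with the
checksum published by the software vendor before installing it."""

import hashlib
import sys
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    # Usage: python verify_download.py <file> <expected_sha256>
    file_path, expected = Path(sys.argv[1]), sys.argv[2].lower()
    actual = sha256_of(file_path)
    if actual == expected:
        print("Checksum matches the published value.")
    else:
        print(f"Checksum mismatch!\n expected: {expected}\n actual:   {actual}")
        sys.exit(1)

A mismatch does not always mean malware, but it does mean the file is not the one the publisher vouched for and should not be installed.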

At Ikaroa, our team of experts is dedicated to providing secure applications and services to our users. Our professionals have the technical capabilities to detect and identify malicious software, and we stay abreast of the latest advances in source-code security so we can respond to cyber threats as quickly as possible. We are committed to giving our customers complete peace of mind when it comes to their digital security.

ikaroa
https://ikaroa.com
