Your ChatGPT credentials are at risk from hackers using info-stealing malware

ChatGPT privacy settings
(Image credit: Future)

What you need to know

  • Group-IB, a cybersecurity firm, has discovered that 101,134 ChatGPT credentials have been traded on dark web marketplaces over the last 12 months.
  • The attackers leveraged info-stealing malware to gain access to these credentials.
  • Users should change their passwords regularly to avoid falling victim to these attacks.

According to a new report by Group-IB, spotted by our sister site TechRadar, ChatGPT users could be susceptible to compromise by hackers.

Per the Singapore-based cybersecurity firm's findings, hackers leveraged info-stealing malware to gain access to users' login credentials. The report detailed that approximately 101,000 ChatGPT credentials were affected by this malicious campaign.

Using its Threat Intelligence platform, Group-IB found 101,134 ChatGPT credentials within logs from info-stealing malware traded on dark web marketplaces over a 12-month period.

The info-stealing malware campaign hit users based in the Asia-Pacific region hardest, with the region accounting for more than two in every five cases. The report further detailed that roughly a quarter of the affected ChatGPT credentials dated from May 2023.

India topped the list with 12,632 compromised credentials, followed by Pakistan and Brazil. There were also 2,995 instances in the US, which came in at number six. 

Essentially, attackers leverage such campaigns against unsuspecting users to gain access to their saved passwords, cookies, browsing history, and more. More recently, instant messengers and email accounts have also become prominent targets, as attackers show growing interest in them and devise new techniques to gain access.

In the survey, approximately 78,000 cases were traced back to the Raccoon info stealer, a clear indication that it was the tool most attackers used to compromise users' security. It was followed by Vidar with just under 13,000 cases and RedLine with close to 7,000 cases.

Given that ChatGPT’s standard configuration retains all conversations, this could inadvertently offer a trove of sensitive intelligence to threat actors if they obtain account credentials.

Dmitry Shestakov, Head of Threat Intelligence at Group-IB

Despite rising concerns over such threats and the long-term harm they could cause, more users continue to sign up for the platform. Unfortunately, many are not equipped with the security measures that would give them a better fighting chance against such campaigns.

As such, the cybersecurity firm recommends changing passwords regularly and enabling more robust security measures, such as two-factor authentication (2FA), on these platforms to avoid falling victim to such attacks.

Jake Moore, Global Cyber Security Advisor at ESET, further noted that non-technical users might not realize how much information is stored in their ChatGPT accounts and the damage it could cause if it fell into the wrong hands.

"It stores all input requests by default and can be viewed by those with access to the account. Furthermore, info stealers are becoming more prominent in ChatGPT compromises and even used in malware-as-a-service attacks. Info stealers focus on stealing digital assets stored on a compromised system looking for essential information such as cryptocurrency wallet records, access credentials, and passwords as well as saved browser logins." 

Kevin Okemwa
Contributor

Kevin Okemwa is a seasoned tech journalist based in Nairobi, Kenya with lots of experience covering the latest trends and developments in the industry. With a passion for innovation and a keen eye for detail, he has written for leading publications such as OnMSFT, MakeUseOf, and Windows Report, providing insightful analysis and breaking news on everything revolving around the Microsoft ecosystem. While AFK and not busy following the ever-emerging trends in tech, you can find him exploring the world or listening to music.

  • GraniteStateColin
    Just checking this here on Windows Central: does this affect users who only access it as Bing Chat, or are those MS account credentials safe? I didn't see anything about Bing or Microsoft in the article, so I assume this only applies to users who access directly through ChatGPT, but given how many are not clear on the differences between ChatGPT and Bing Chat, some clarification on this would be helpful. Also, if this is limited to ChatGPT, this could be an important selling point for Bing Chat over ChatGPT: same (similar?) service with better security.