
Compromised ChatGPT Accounts on Dark Web: Comment by Satnam Narang, Sr. Staff Research Engineer, Tenable

“Information-stealing malware, such as Raccoon, Vidar and RedLine, is capable of stealing sensitive information stored in web browsers, including user credentials (username/email and password), session cookies and browser history. Credentials tied to finance, social media and other services are likely to be compromised. The reporting from Group-IB reflects the growing worldwide interest in generative AI tools like ChatGPT, and as a result, ChatGPT user credentials are being harvested by information-stealing malware.

“Asia-Pacific had over 40,000 compromised accounts between June 2022 and May 2023. India is the most affected country, with over 12,000 stolen credentials being sold. The biggest threat posed by compromised credentials is the exposure of conversations between users and ChatGPT, which may include other sensitive information, whether personally identifiable information or workplace-related information, including sensitive company data.

“Another area of concern is password reuse. Historically, we know users tend to reuse passwords across multiple sites, so if users’ ChatGPT account credentials have been compromised, other accounts may be at risk as well if those users reused their ChatGPT password elsewhere.

“At this time, OpenAI has temporarily paused the enrolment of two-factor authentication for ChatGPT. Once enrolment has been re-enabled, users should add it as an additional security measure. However, it’s important to note that information-stealing malware also steals session cookies; if valid ChatGPT session cookies are being sold on the dark web, they can be used to bypass account security features such as two-factor authentication. Irrespective of this, we still advise users to enable this feature on their ChatGPT accounts.” – Satnam Narang, Sr. Staff Research Engineer, Tenable.
