ChatGPT and its potential cybersecurity risks | Comment from Satnam Narang, Sr. Staff Research Engineer, Tenable


“ChatGPT has reportedly been used by low-skilled cybercriminals to develop basic malware, and it may be used by scammers to craft phishing scripts for use in both phishing emails and dating and romance scams.

“It is not impossible, but far less likely, that ChatGPT will produce a professionalised piece of ransomware or other malicious software. It can, however, provide the basic foundation for a low-skilled cybercriminal to kick-start their efforts and put them on the path towards success.

“Poorly constructed sentences and grammatical errors are among the few tell-tale signs of phishing emails and fake dating app profiles. With the prevalence of Pig Butchering scams across social networks and messaging apps, ChatGPT could help fill that gap by writing more convincing bios for fake profiles, which are often easy to spot because of poorly constructed text. It could also be used to produce the scripts dating and romance scammers rely on when trying to convince potential victims to part with their money or cryptocurrency.

“ChatGPT does have some guardrails that might dissuade scammers or prevent them from getting exactly what they need, but they may still find ways around them through how they frame their requests.

“We’re still in the early stages of seeing ChatGPT’s impact on a broader level, but it’s clear that, as with any new technology, cybercriminals will seek to find ways to abuse it for their own financial gain.” — Satnam Narang, Sr. Staff Research Engineer at Tenable.