Generative AI in the hands of criminals

The potential danger of criminals using generative artificial intelligence

Criminals’ potential use of generative artificial intelligence such as ChatGPT provides a grim outlook. Europol identifies three areas as especially critical: fraud and social engineering, disinformation, and cybercrime. After workshops with experts, the organisation has compiled an overview with recommendations.

“Criminals are typically quick to exploit new technologies and were fast seen coming up with concrete criminal exploitations, providing first practical examples mere weeks after the public release of ChatGPT.”


The following three crime areas are among the many of concern, according to the experts:

  • Fraud and social engineering: The ability to draft highly realistic text makes generative AI a useful tool for phishing. It can imitate the style of speech of specific individuals or groups, a capability that can be abused at scale to mislead potential victims into trusting criminals.
  • Disinformation: Generative AI can produce authentic-sounding text at speed and scale. This makes the model ideal for propaganda and disinformation purposes, as it allows users to generate and spread messages reflecting a specific narrative with relatively little effort.
  • Cybercrime: In addition to generating human-like language, generative AI is capable of producing code in a number of different programming languages. For a potential criminal with little technical knowledge, this is an invaluable resource to produce malicious code. 

“As technology progresses, and new models become available, it will become increasingly important for law enforcement to stay at the forefront of these developments to anticipate and prevent abuse”, Europol says. 

“Going forward, the universal availability of large language models may pose other challenges as well: the integration of other AI services (such as for the generation of synthetic media) could open up an entirely new dimension of potential applications.” 


“One of these include multimodal AI systems, which combine conversational chat bots with systems that can produce synthetic media, such as highly convincing deepfakes, or include sensory abilities, such as seeing and hearing.” 

“Other potential issues include the emergence of ‘dark LLMs’ (large language models), which may be hosted on the dark web to provide a chat bot without any safeguards, as well as LLMs that are trained on particular – perhaps particularly harmful – data.”

“Finally, there are uncertainties regarding how LLM services may process user data in the future – will conversations be stored and potentially expose sensitive personal information to unauthorised third parties? And if users are generating harmful content, should this be reported to law enforcement authorities?”

Key recommendations: 

  • Given the potential harm, it is of utmost importance that awareness is raised on this matter, to ensure that any potential loopholes are discovered and closed as quickly as possible.
  • Law enforcement agencies need to understand the impact on all potentially affected crime areas to be better able to predict, prevent, and investigate different types of criminal abuse. 
  • Law enforcement officers need to start developing the skills necessary to make the most of generative AI. 
  • It is critical to engage with relevant stakeholders to ensure that relevant safety mechanisms remain a key consideration and are constantly improved.
  • Law enforcement agencies may want to explore the possibilities of customised AI trained on their own data for more tailored and specific use.


Moonshot News is an independent European news website for all IT, Media and Advertising professionals, powered by women and with a focus on driving the narrative for diversity, inclusion and gender equality in the industry.
