Generative AI may be an effective tool for businesses, especially those that need presentable content churned out quickly. However, it comes with its downsides – a tool that’s so effective at creating content is a dream come true for scammers. They rely on fooling you into thinking that what you see and read is authentic. AI makes that “authenticity” possible, which means AI makes scams more dangerous.
How?
We explain how AI makes scams more dangerous in several ways.
Issue 1 – Improving Phishing Attacks
Phishing emails are often easy to spot for those aware of the signs. Spelling and grammatical errors you’d never find in an authentic email are often present in phishing emails, as are formatting issues and the use of incorrect names.
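To make those hallmarks concrete, here is a minimal illustrative sketch in Python. It is not a real phishing detector – the misspelling list, greeting phrases, and urgency keywords are invented for this example – but it shows the kind of telltale signs described above that AI-written emails no longer contain:

```python
import re

# Hypothetical, simplified hallmark lists -- real phishing detection is far
# more involved; these values only illustrate the signs described above.
COMMON_MISSPELLINGS = {"recieve", "acount", "verifiy", "securty"}
GENERIC_GREETINGS = ("dear customer", "dear user", "dear sir/madam")
URGENCY_WORDS = ("urgent", "immediately")

def phishing_hallmarks(email_text: str) -> list[str]:
    """Return the classic phishing warning signs found in the text."""
    flags = []
    lowered = email_text.lower()
    words = set(re.findall(r"[a-z]+", lowered))
    if words & COMMON_MISSPELLINGS:
        flags.append("common misspellings")
    if lowered.startswith(GENERIC_GREETINGS):
        flags.append("generic greeting instead of your name")
    if any(word in lowered for word in URGENCY_WORDS):
        flags.append("pressure / urgency language")
    return flags

print(phishing_hallmarks("Dear customer, please verifiy your acount immediately."))
# -> ['common misspellings', 'generic greeting instead of your name',
#     'pressure / urgency language']
```

The point of the sketch is the article's warning: an email written by a modern LLM would trip none of these simple checks, which is exactly what makes AI-assisted phishing harder to spot.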
Large language models (LLMs), which are the powerhouses behind generative AI tools, help scammers avoid these issues. They can draw on up-to-date information from company websites, enabling scammers to create phishing emails that look convincing and don’t bear the hallmarks of emails written without AI.
For instance, check out this prompt and the result produced when using Gemini to create an email from Amazon:
You’ll see what looks like a legitimate Amazon email, with correct formatting and even prompts to include your order number and additional details where required. Scammers can use tools like Gemini to create more convincing phishing emails than ever before, making their scams more dangerous.
Issue 2 – Removing Accents With Voice AI
AI tools are becoming increasingly capable of both cloning other people’s voices and removing the accents from voices so they sound “native.” The latter use is ideal for individual users who want to work on their linguistic skills. However, removing accents from voices is also perfect for scammers.
A scammer can use AI voice cloning – such as that provided by Voice.ai – to create robocall scripts using voices that sound local to their intended victims. Using this technology, a scammer can create a full AI voice from a 30-second snippet pulled from social media. The technology is often used in the family emergency scam. One of your family members receives a call late at night, supposedly from either you or the emergency services, telling them that you’ve been in an accident. The scammer pushes for money to be sent immediately to help you, with a cloned AI voice – perhaps generated using clips you’ve posted online – making the scam more effective.
Choosing a code word for all family members to use is a good way to avoid this scam.
Issue 3 – AI Used in Dating Scams
Catfishing and lonely-heart scams have always been common. With AI, scammers have several new tools at their disposal. They can use AI to alter images or even create them from scratch, allowing the scammers to essentially “create” entire people designed to attract your attention. The same generative AI used to create more “authentic” content for phishing emails can also work wonders for a scammer here – it’s used to create bios and even carry out full conversations with the scammer’s targets.
AI makes scams more dangerous because it helps scammers remove the warning signs that savvy daters would normally pick up on. A bot account seems much more real when an AI is able to respond to questions and comments you make in a conversation using context clues.

When my sweet old grandmother got caught up in an Amazon gift card scam, I decided then and there that I needed to do whatever I could to inform as many people as possible about the grifters of the world. That’s what I do here – writing about modern scams so you don’t get caught out.