
AI Voice Scams in Australia


Cybertrace Team

June 23, 2024 · 8 min read


AI voice scams in Australia are more prevalent than ever, with victims ranging from busy mums to the elderly and losses reaching a staggering $568 million. Have you been a victim of one of these scams? Do you want to fight back against the scammers? While they may be experts at orchestrating these frauds, there is still hope of holding them accountable.

What is an AI Voice Cloning Scam?

Visualisation of an Artificial Intelligence fraud.

An AI voice cloning scam is a type of fraud where criminals use artificial intelligence to create highly realistic replicas of someone’s voice. These voice clones are generated using advanced generative AI technologies that can mimic the tone, pitch, and cadence of a person’s speech after analysing just a few seconds of their audio. Scammers typically use these cloned voices to deceive victims by impersonating friends, family members, or even public figures, tricking them into providing sensitive information or making financial transactions under false pretences.

Additionally, the effectiveness of AI voice cloning scams lies in their ability to create a convincing audio representation of the target’s voice, making it difficult for victims to distinguish between the real and fake voices. This technology relies on publicly available voice samples, often sourced from social media, videos, or recorded phone calls, which are then used to train the AI models to replicate the voice accurately. As a result, these scams can create a sense of urgency and apply emotional manipulation, leading to substantial financial and emotional harm for victims.

Finally, these scams are becoming increasingly advanced and widespread, with cases reported worldwide. The ease with which these voice clones can be created poses significant challenges for individuals and organisations in maintaining security and trust in voice communications.

How Many People have Been Scammed by an AI Voice in Australia?

AI voice scams are becoming a major concern in Australia, mirroring trends observed globally. In 2023, Australians reported an average of 1,500 scam cases per month, with a large portion involving some form of impersonation, including AI voice scams. These scams often use advanced AI technology to clone voices from minimal audio samples, making it difficult for victims to discern genuine calls from fraudulent ones.

Experts predict that AI voice scams will increase in 2024 as scammers use this advanced technology to manipulate unsuspecting victims. Such scams are already widespread in countries like the UK and the US, and are expected to become more common in Australia. The AI can recreate a voice with high accuracy from a brief recording, enabling scammers to impersonate loved ones or trusted figures convincingly.

How does an AI Voice Scam Work?

Artificial Intelligence copying a human.

AI voice scams involve scammers using advanced tools to clone a person’s voice from minimal audio data found online. Once replicated, the cloned voice is used to impersonate someone familiar to the victim, manipulating emotions to trick them into urgent financial actions, leading to substantial losses.

Cloning Process

Scammers use advanced AI tools to analyse and replicate a person’s voice characteristics. This process requires minimal audio data, often as little as three seconds, to create a realistic voice clone. The ease of access to voice data from social media posts, YouTube videos, and other online content makes it simple for cybercriminals to gather the necessary samples.

Impersonation

The cloned voice is then used to impersonate someone the victim knows, such as a family member, friend, or even a trusted authority figure. The scammer usually fabricates a distressing scenario, like a car accident or an urgent financial need, to manipulate the victim into sending money quickly. The authenticity of the cloned voice makes these scams particularly convincing and hard to detect.

Emotional Manipulation

These scams are highly effective due to the emotional manipulation involved. Scammers create a sense of urgency and pressure the victim to act immediately, often preventing them from taking the time to verify the caller’s identity. The impersonated voice’s emotional pleas can lead victims to make hasty decisions, resulting in significant financial losses.

How to Avoid an AI Voice Scam

A person avoiding artificial intelligence fraud.

To avoid falling victim to AI voice scams, it’s important to implement several protective measures:

Verify Caller Identity

  • Ask for details: Request the caller’s name, position, and contact information, then verify these details through official channels.
  • Personal questions: Ask questions only the real person would know the answers to.
  • Call back: Hang up and call the person’s known number directly to confirm the situation.

Use Secure Communication Practices

  • Secret code: Establish a secret code word or phrase with family members that must be used in emergencies.
  • Alternative contact: Reach out to a mutual contact to verify the caller’s identity.

Be Sceptical of Unsolicited Calls

  • Caution: Treat unsolicited calls with scepticism, especially those asking for urgent action or sensitive information.
  • No immediate payments: Avoid making immediate payments, particularly via unconventional methods like gift cards or wire transfers.

Technology Tools

  • Call blocking: Use call-blocking features on your phone to filter out suspicious numbers.
  • Check caller ID: Be wary of unfamiliar numbers and avoid trusting caller ID alone, as it can be spoofed.

Preventative Measures

  • Limit voice data: Be cautious about sharing voice recordings or personal information online.
  • Privacy settings: Regularly review and adjust privacy settings on social media to limit access to your personal data.
  • Voice authentication: Use voice authentication or biometric security where available.

What to do if you’ve been Scammed in an AI Voice Scam?

A private investigator investigating an AI scam

Cease Communication

Immediately stop all communication with the scammer. Do not provide any further information or make any additional transactions.

Contact Your Bank

Inform your bank or financial institution about the scam, especially if you have shared any banking details or made payments. They can monitor your accounts for suspicious activity and possibly reverse fraudulent transactions.

Monitor Your Accounts

Keep a close watch on your financial statements and online accounts for any unauthorised activities. Report any suspicious actions to your financial institution immediately.

Change Passwords

Update passwords for all your online accounts, especially those linked to sensitive information or financial transactions. Use strong, unique passwords and consider enabling two-factor authentication.

Report to Scamwatch

Report the incident to Scamwatch, Australia’s official scam reporting site, to help authorities track and combat these scams.

Contact Cybertrace

Reach out to Cybertrace, a firm specialising in scam investigations. We can assist in tracking the scammers and gathering evidence to support your case.

If you have suffered substantial financial loss or your personal information has been misused, consider consulting with a legal advisor for additional support and potential recourse.

Example of AI Voice Scams

Representation of an AI (artificial intelligence) scam

AI voice technology is increasingly used in scams such as impersonating political figures to influence elections or mimicking friends’ voices to solicit urgent funds through emotional appeals. Scammers also deploy AI-generated voice clones in bank fraud and business email compromise schemes, exploiting trust and familiarity to deceive victims into sharing sensitive information or transferring money.

Political Scams

During the 2024 US presidential primaries, robocalls using AI-generated voices impersonated President Joe Biden, urging New Hampshire voters not to participate in the primary election. This scam aimed to manipulate voter behaviour using a highly convincing voice clone of the President.

Impersonating Family Members

In a case from Ontario, a man was scammed out of $8,000 when he received a call from someone who mimicked his friend’s voice, claiming to be in a serious accident and needing money urgently. The emotional appeal and urgency made it hard for the victim to verify the authenticity of the call before transferring the money.

Ontario man loses $8K in AI phone scam using friend’s voice

Bank Fraud

AI-generated voice clones are used to impersonate bank representatives, convincing victims to share sensitive information such as account numbers and passwords. This type of scam often involves unsolicited calls where the fraudster uses the cloned voice to gain the victim’s trust.

Lloyds Bank logged into using AI voice 

Business Email Compromise (BEC)

Scammers use AI voice technology to impersonate senior executives or business partners during phone calls or video conferences. The goal is to trick employees into transferring funds or sharing sensitive information. Such scams exploit the authority and familiarity of the cloned voice to bypass normal security protocols.

How scammers can use ‘deep voice’ AI technology to trick you | About That

Related topic: Unmasking Deception: Elon Musk, Australian PM Deep Fake Scam

Contact Us

If you’ve been a victim of an AI voice scam or need help protecting against them, contact Cybertrace for an expert investigation. We’re here to help you stay safe and secure.

Questions for the Readers

Have you or someone you know been affected by an AI voice scam? How did you handle the situation?



Contact Us

Contact our friendly staff at Cybertrace Australia for a confidential assessment of your case. Speak with the experts.

Email: [email protected]
Phone (International): +61 2 9188 7896