Sunday, October 13, 2024

    Bank warns: millions of people could be victims of financial fraud

    “Millions” of people could fall victim to scams that use artificial intelligence to clone their voice, a British bank has warned.


    Scammers can identify a person’s friends and family members and use an AI-cloned voice to place a phone call asking for money, according to CNN.

    These types of scams have the potential to “entrap millions,” Starling Bank said in a press release Wednesday.


    Hundreds have already been affected

    According to a survey of more than 3,000 adults carried out by the bank with Mortar Research last month, more than a quarter of respondents said they had been the target of an AI voice cloning scam in the past 12 months.

    The survey also showed that 46% of respondents were not aware that such scams exist, and that 8% would send as much money as a friend or family member asked, even if they thought the call was strange.

    “People regularly post content online that has recordings of their voice, without realizing that doing so makes them more vulnerable to fraudsters,” Lisa Graham said.

    The bank is encouraging people to agree with their loved ones on a “safe phrase” – a simple, random phrase that’s easy to remember and different from other passwords – that can be used to verify their identity over the phone.

    The bank advises against sharing the safe phrase by text message, which could make it easier for scammers to discover it; if it is shared that way, the message should be deleted once the recipient has seen it.

    As AI becomes more adept at imitating human voices, there are growing concerns about its potential to harm people, for example by helping criminals access their bank accounts and spread misinformation.

    Earlier this year, OpenAI, maker of the generative AI chatbot ChatGPT, unveiled its voice replication tool, Voice Engine, but did not make it available to the public at that stage, citing the “potential for synthetic voice abuse”.





    News Room
    Story byline by the MegaloPreneur News Room – a group of passionate journalists and editors dedicated to bringing you reliable and engaging news stories.
