• Hodor@sh.itjust.works
    3 days ago

    Except now they record your voice and use it to train voice AI and scam you harder. My coworker’s ex-husband got a call from their “daughter”, distressed, “kidnapped”, needing money for ransom. He sent it, then called the ex-wife. The daughter was sleeping at home.

    • T156@lemmy.world
      3 days ago

      I wonder if they do. That seems like a lot of effort for a scammer to go to for the average person.

      It seems easier to have a generic voice, rely on the fact that phone audio quality isn’t great to bridge the gap, and use a shotgun approach.

      Some places do, since there were a few high profile attacks, but they were nearly all targeting organisations by pretending to be the CEO or something.

      • TehWorld@lemmy.world
        3 days ago

        Once it’s automated, it’s the same effort either way. Probably something even vibe coding could pull off.

      • SaharaMaleikuhm@feddit.org
        3 days ago

        I still have an ace up my sleeve: I don’t pick up the phone unless I know who is calling or am otherwise expecting a call.
        Right now I just get the occasional one-liner email: “hey Sahara what are you doing tonight?” Who the hell falls for that?