• cum@lemmy.cafe · 10 months ago

    Uh, there’s zero chance these big tech companies are selling voices like this. Also, this sounds very targeted and planned, so there must be more context to it. And why the hell are they on Bluesky?

  • voxel@sopuli.xyz · 10 months ago

    you don’t even need to fake a voice for these scams tho, it’s very difficult to differentiate a voice while the caller is crying

  • Zeshade@lemmy.world · 10 months ago

    Do a lot of people put their voice on the internet “as much as they’re able to”? It sounds like that person may post their voice online more than the average person…

    • Powerpoint@lemmy.ca · 10 months ago

      Discord just automatically opted everyone in to having their voice recorded for clips.

    • maynarkh@feddit.nl · 10 months ago

      I imagine even discounting social media self-posts, there are VoIP calls, etc.

      Don’t assume a call with your mom through Facebook Messenger or Zoom or FaceTime or whatever is not somehow packaged and sold.

    • Szymon@lemmy.ca · 10 months ago

      You can train an AI with just a single voice clip, and you can do it on your own desktop. Microsoft doesn’t need to sell shit; you put that clip on TikTok yourself.

        • tiramichu@lemm.ee · 10 months ago

          The ‘old’ way of faking someone’s voice, like you saw in 90s spy movies, was to collect enough sample data to capture every possible speech sound a person could make, so those sounds could be recombined into any word.

          With AI training you only need enough data to know what someone sounds like ‘in general’ to extrapolate a reasonable model.

          One possible source of voice data is spam-calls.

          You get a call, say “Hello?”, and someone launches into trying to sell you insurance or some rubbish. You say “Sorry, I’m not interested, take me off your list please. Okay, bye” and hang up.

          And that is already enough data to replicate your voice.

          When scammers make the call using your fake voice, they usually use a crappy-quality line, background noise, or other techniques to cover up any imperfections in the voice replica. And of course they make it really emotional, urgent and high-stakes to override your family member’s logical thinking.

          Educating your family to be prepared for this stuff is really important.

        • Szymon@lemmy.ca · 10 months ago

          Yeah I’m gonna go ahead and not give that knowledge out.

      • brrt@sh.itjust.works · 10 months ago

        You don’t even need to upload anything. They can call you, have a short convo and then just say “oh sorry wrong number” or something. You’d never know.

        • SomeGuy69@lemmy.world · 10 months ago

          Yup. You need like 5 to 15 seconds of talking, that’s it. I’ve done this myself to confirm it, and it actually works quite well.

      • unexposedhazard@discuss.tchncs.de · 10 months ago

        Well, they said they don’t share their voice anywhere; if that’s true, it would be concerning. I for one just don’t use any centralized, unencrypted services that could scrape my voice, but I would assume most people think that if they don’t publish anything, they’re safe…

  • abbadon420@lemm.ee · 10 months ago

    When I was a kid, my parents had “the talk” with me. It was about sex. Now I’m older and my parents are too. I have to have “the talk” with them. It’s about scams.

  • ɔiƚoxɘup@sh.itjust.works · 10 months ago

    All those TV shows that taught us how to spot which twin was the evil one by asking about life history were just training us to beat AI

  • DirkMcCallahan@lemmy.world · 10 months ago

    Waiting for the comment that’s going to say something like, “Joke’s on you, my parents don’t even talk to me.”

  • rbesfe@lemmy.ca · 10 months ago

    The only way to train an AI voice model is to have lots of samples. As scummy as they are, neither Microsoft nor Apple is selling your voice recordings with enough info to link them to you specifically. This person probably just forgot about an old social post where they talk for enough time for a model to be trained. Still super scary stuff.

    • altasshet@lemmy.ca · 10 months ago

      Not true anymore. You can create a reasonable voice clone with like 30 seconds of audio now (11labs, for example, doesn’t do any kind of authentication). The results are good enough for this kind of thing, especially in a lower-bandwidth situation like a phone call.
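
      To give a sense of how little this takes: here’s a minimal sketch of one-shot cloning, assuming the open-source Coqui TTS library and its XTTS v2 model (my example for illustration; nothing in the post says which tool was actually used). A short reference clip is the only speaker-specific input:

      ```python
      # One-shot voice cloning sketch using Coqui TTS (pip install TTS).
      # The reference clip name is hypothetical; roughly 10-30 seconds of
      # clean speech is enough for the model to imitate the speaker.
      from TTS.api import TTS

      # Load a multilingual model that supports cloning from a reference clip.
      tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

      # Synthesize arbitrary new speech in the cloned voice.
      tts.tts_to_file(
          text="I never actually said this sentence.",
          speaker_wav="reference_clip.wav",  # the only speaker-specific data
          language="en",
          file_path="cloned.wav",
      )
      ```

      Note there’s no per-victim training run and no GPU cluster: the heavy lifting happened when the model was pre-trained on thousands of voices, which is also why phone-call compression hides the remaining artifacts so well.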

    • Wirlocke@lemmy.blahaj.zone · 10 months ago

      True for creating voices from scratch, but that work has already been done.

      Now we’re just taking these large AIs trained to mimic voices and giving them a 30-second audio clip to tell them what to mimic. It can be done quickly and gives convincing results, especially when hidden by phone-call quality.

  • paulcdb@sh.itjust.works · 10 months ago

    Tbh it’s not that hard to stop these scams: treat EVERY call you get as coming from a scammer!

    Either phone back on a known number, not some number they give you, or, if they claim you need to post bail, ask for a reference number and where the person is being held, then look up that place’s number yourself and call it. If they get pissed, it’s a scam!

    No real police force is going to care (or shouldn’t care) if you call back. It’s not like cops get a percentage of the bail money, but scammers always seem too desperate to get you to pay, and they lose their patience pretty quickly.

  • M500@lemmy.ml · 10 months ago

    I advise everyone to contact their loved ones and inform them of this possibility. I also advise agreeing on a codeword to be used if there’s an emergency and money needs to be sent.

    For example: if more than $100 is being asked for, we have to exchange the codeword, or we don’t transfer the money.
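
    For more technical families, you could go one step further (my own suggestion, not something from the post): derive the codeword from a shared secret so it changes daily and an overheard word can’t be replayed later. A minimal sketch using only Python’s standard library; the secret and word list are hypothetical and should be agreed on in person:

    ```python
    # Rotating family codeword sketch: both sides compute the same word
    # for the current day from a secret agreed on in person.
    import hmac
    import hashlib
    from datetime import datetime, timezone

    FAMILY_SECRET = b"agreed-in-person-never-over-the-phone"  # hypothetical
    WORDS = ["pancake", "walrus", "tangerine", "bicycle", "nebula", "teapot"]

    def todays_word() -> str:
        """Return today's verification word (UTC date keyed by the secret)."""
        today = datetime.now(timezone.utc).date().isoformat().encode()
        digest = hmac.new(FAMILY_SECRET, today, hashlib.sha256).digest()
        return WORDS[digest[0] % len(WORDS)]

    print(todays_word())  # anyone asking for money must say this word first
    ```

    A single fixed codeword works fine too; rotation only matters if a scammer might have overheard it once.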

  • Tnaeriv@sopuli.xyz · 10 months ago

    They don’t even have to scrape calls. If you use Alexa, Siri or Google Assistant, they already have recordings of your voice on their servers