I wanted to maybe start making PeerTube videos, but then I realized I’ve never had to consider my voice as part of my threat model. The consequence that immediately comes to mind is having your voice used as training data for an AI voice clone, but I’m not (currently) in a position where others would find it desirable to do so. Potentially in the future?

I’d like to know how your threat model handles your personal voice. And as a bonus: would voice modulators help make your voice more flexible within your threat model, or hinder that? Thanks!

  • AnAmericanPotato@programming.dev · 9 points · 8 hours ago

    I’m not (currently) in a position where others would find it desirable to do so. Potentially in the future?

    It’s hard to imagine a scenario where this would happen and your voice would not otherwise be available. For example, if you went into politics, then you’d be a target, but you’d already be speaking in public all the time. It only takes a few seconds of sample audio to clone a voice nowadays, and it’ll only get easier from here.

    Maybe just make a point to educate your family and friends on the risk of voice cloning so they don’t fall for phone scams.

    • grehund@beehaw.org · 7 points · edited · 4 hours ago

      Create a secret passphrase, known only to your family, that can be used to verbally verify it’s the real you and not a scam caller. Bonus points: create an alternate passphrase that can be used to signal that you’re under duress.
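      The duress variant deserves a moment of thought: it should sound completely innocuous to whoever is listening while routing an alert elsewhere. If you ever wanted to automate the check (say, a call-screening bot for a relative), the scheme maps to a few lines. A minimal sketch in Python; the phrases and the transcription step are hypothetical, and real phrases should be agreed on in person, never in writing:

      ```python
      import hashlib
      import hmac

      # Hypothetical phrases, for illustration only. Store hashes rather
      # than the phrases themselves, so a leaked config doesn't leak the secret.
      SAFE_HASH = hashlib.sha256(b"blue pelican").hexdigest()
      DURESS_HASH = hashlib.sha256(b"red pelican").hexdigest()

      def check_phrase(spoken: str) -> str:
          """Classify a transcribed phrase from the caller."""
          digest = hashlib.sha256(spoken.strip().lower().encode()).hexdigest()
          # compare_digest is constant-time, so the check leaks nothing via timing.
          if hmac.compare_digest(digest, SAFE_HASH):
              return "verified"
          if hmac.compare_digest(digest, DURESS_HASH):
              return "duress"  # respond normally to the caller, alert someone else
          return "unverified"
      ```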

    • utopiah@lemmy.ml · 4 points · 7 hours ago

      educate your family and friends on the risk of voice cloning so they don’t fall for phone scams.

      Absolutely. In fact, you can easily do it yourself as a harmless prank, just to gauge how they’d react: clone your own voice, create a new email address (e.g. LGTM_butnotop@discuss.tchncs.de), and attach a recording in which you ask for a Netflix/Apple/whatever gift card.
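
      For reference, the self-clone step really is only a few lines nowadays. A minimal sketch, assuming the open-source Coqui TTS library and its XTTS v2 voice-cloning model (the filenames and message text here are hypothetical):

      ```python
      # pip install TTS  (Coqui TTS; model availability may change over time)
      from TTS.api import TTS

      # A short, clean recording of your own voice; a few seconds suffices.
      VOICE_SAMPLE = "my_voice_sample.wav"  # hypothetical filename

      # Load the multilingual XTTS v2 model, which supports zero-shot cloning
      # from a reference clip.
      tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

      # Synthesize the prank message in the cloned voice.
      tts.tts_to_file(
          text="Hey, it's me. I'm in a pinch, could you pick up a gift card?",
          speaker_wav=VOICE_SAMPLE,
          language="en",
          file_path="prank_message.wav",
      )
      ```

      Only clone your own voice, and debrief everyone afterwards; the point is to show how convincing a few seconds of sample audio can be, not to actually deceive anyone.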