• Electricd@lemmybefree.net
    edit-2 · 6 hours ago

    That’s known. Siri data is kept to improve the models through human labeling. It’s not like it was hidden — just read the damn privacy policy.

    If that’s your magical source as an insider, I’m sorry, but you’re bullshitting. It didn’t prove anything you said, either.

    It’s not spying, since that wasn’t their goal. It sure is shitty, but you can’t compare it to the stuff Microslop and Google do.

    • toad@sh.itjust.works
      6 hours ago

      That’s not the problem. Because of false positives, they were hearing people during everyday interactions. I remember my colleague being bothered that they were hearing people having sex or talking about drugs, all while personal information was displayed on screen.

      Do you want some guy at Apple headquarters hearing a random snippet of your life because you said the word “Shiny” and the model messed up?

      • Electricd@lemmybefree.net
        5 hours ago

        I disabled that voice activation feature for this exact reason, but yeah, what’s shitty is that people were never clearly informed at all.