• Snot Flickerman
    link
    fedilink
    English
    0 • 11 months ago

    Services that “listen” for commands, like Siri and Alexa, have to be always listening by default; otherwise they could not hear the activation command. They are supposed to dump the excess data, such as anything captured before the activation command, but that’s just a promise. There are very few laws protecting you if that promise turns out to be a lie. The best you can get is likely a small restitution through a class-action lawsuit (if you didn’t waive your right to that by agreeing to the Terms of Service, which, more often than not, you have these days).

    Of fucking course they’re listening.
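
    The buffering scheme described in that comment can be sketched roughly like this. Everything here (the frame strings, the `listen` function, the wake phrase) is hypothetical illustration, not any real assistant's API: pre-activation audio sits only in a short rolling buffer and is discarded the moment the wake word fires, so only post-activation audio leaves the device — if the promise is kept.

    ```python
    from collections import deque

    WAKE = "hey_device"   # placeholder wake phrase
    BUFFER_FRAMES = 4     # tiny rolling buffer for the sketch

    def listen(frames):
        """Return the frames that would leave the device.

        Frames heard before the wake word live only in a fixed-size
        ring buffer (old frames fall off automatically) and are
        dumped on activation -- the "promise" in question.
        """
        ring = deque(maxlen=BUFFER_FRAMES)
        uploaded = []
        awake = False
        for frame in frames:
            if awake:
                uploaded.append(frame)   # post-activation audio goes out
            else:
                ring.append(frame)       # pre-activation audio stays local
                if frame == WAKE:
                    ring.clear()         # discard everything heard before
                    awake = True
        return uploaded

    print(listen(["a", "b", "hey_device", "set", "a", "timer"]))
    # → ['set', 'a', 'timer']
    ```

    Whether the real devices actually behave like the `ring.clear()` line is exactly the unverifiable promise the comment is pointing at.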

    • @Serinus@lemmy.world
      link
      fedilink
      11 • 11 months ago

      They’re not. Not yet. People are on edge and looking for exactly this, and it hasn’t happened yet. Meanwhile, they’ve already built a pretty damn good profile of you from your search queries and mistyped URLs.

    • @null@slrpnk.net
      link
      fedilink
      11 • 11 months ago

      > They are supposed to dump the excess data like anything that came before the activation command, but that’s just a promise.

      Where are they hiding that data locally, and how are they making it invisible in transit?