I’m never putting one of these in my home.

  • EnderMB@lemmy.world · 1 year ago

    I work for Amazon.

    This has been the case for many years. Amazon has used AI as a core part of Alexa and other services for years, and has told its users that it uses their data for just as long. We’re talking from close to inception here, so 6-7 years at least. Hell, LLMs aren’t even new to most big tech companies!

    I’m all for privacy, but if you want privacy then you probably shouldn’t have a fucking tin can in your house that ships every conversation off to a cloud service!

        • EnderMB@lemmy.world · 1 year ago

          Considering I set up one of the content types relating to wakeword and utterance text analysis for Alexa, I trust it completely.

          • Duamerthrax@lemmy.world · edit-2 · 1 year ago

            But can I trust you? Are you willing to share the source code?

            Edit: Tell me why I’m supposed to trust an internet rando?

            • EnderMB@lemmy.world · 1 year ago

              You’re right to be distrustful, but there’s a fine line between a healthy distrust of a closed ecosystem and blind worry/cynicism.

              Obviously I’m not going to share proprietary source code. Even if I did, it would mean very little without knowing the upstream and downstream services. What I will say is that Amazon is at least honest about what its services do, even if it’s in the fine print. Customers can delete their data whenever they choose, and there are serious (internal) consequences when things like data deletion and DSAR requests aren’t handled properly.

              • Duamerthrax@lemmy.world · 1 year ago

                Also, it would mean very little without also inspecting every chip on the board. You could easily have written safe code, but the audio signal could still be intercepted before it ever reaches that code.

                Alexa doesn’t solve any problems and only exists to make consumption easier. It’s not something I need to trust because it’s not something I or anyone else needs.