• AwkwardLookMonkeyPuppet@lemmy.world · 7 months ago

    Anything is legal when the customer’s only options are to agree or to stop using your product. They can say whatever they want in the ToS, because it’s 365 pages long and only attorneys can understand what it actually says.

      • kakes@sh.itjust.works · 7 months ago

        And based on their track record, they will just quietly turn it back on.

        Microsoft is so far beyond the benefit of the doubt they couldn’t get back to it if they tried.

        • efstajas@lemmy.world · 7 months ago

          Are there actually any documented cases of them re-enabling userland features after they’ve been disabled? The only thing I’d heard of before was registry edits / telemetry changes being undone. Not that that’s cool, of course, but at least it’s not asking for your privacy settings during setup and then undoing your choices. Then again, maybe I’m just out of the loop.

          Generally though, what do you think would actually be Microsoft’s motivation to randomly re-enable this particular feature? Do you think that the claim that the data doesn’t leave the device is a lie?

          • kakes@sh.itjust.works · 7 months ago

            Does it get much worse than telemetry settings being quietly re-enabled? It’s spyware at the best of times, let alone when they get sneaky about it. And I’ve definitely had Windows revert privacy/telemetry options I set during setup, multiple times.
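
            (If you want to watch for that yourself: here’s a minimal sketch in Python, assuming the commonly documented DataCollection policy key. The key path and value name are the standard Windows policy locations, not something specific to this thread; run it periodically and diff the output to catch silent resets.)

            ```python
            # Minimal sketch: read the AllowTelemetry policy value so you can
            # log it over time and notice if it gets silently changed back.
            # Assumes the standard policy key; Windows-only (uses winreg).
            import winreg

            KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\DataCollection"

            def read_allow_telemetry():
                """Return the AllowTelemetry value, or None if the policy isn't set."""
                try:
                    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
                        value, _ = winreg.QueryValueEx(key, "AllowTelemetry")
                        return value  # 0 = Security, 1 = Basic, 2 = Enhanced, 3 = Full
                except FileNotFoundError:
                    return None  # key or value missing -> policy not configured

            if __name__ == "__main__":
                print("AllowTelemetry policy:", read_allow_telemetry())
            ```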

            I don’t necessarily think they’re stupid enough to roll out the full data-harvesting machine on day one. They’ll release what they can get away with - in this case, taking screenshots and storing them locally - and boil the metaphorical frog from there. Maybe they’ll offer more powerful AI by running it through their servers, and then they can start “accidentally” opting people into that “service”.

            I’m not even necessarily saying there’s some grand scheme going on here, but nobody can possibly deny they have every incentive to push that boundary until it breaks, and they have consistently shown that they will pursue that incentive without any regard for user privacy whatsoever.

            We know this because they have done it so many times before.