• southsamurai@sh.itjust.works

    Frankly, good.

    Not one of these purported “child protection” scams would do a damn thing for kids; they only invade the privacy of people who have zero reason to be investigated in the first place.

    • PumaStoleMyBluff@lemmy.world

      They could at least do on-device hash lookups and block sending on a match. That has zero effect on privacy and does reduce CSAM.
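
      A minimal sketch of what that kind of on-device check could look like (purely illustrative, not anything Telegram actually implements): hash the outgoing file and refuse to send it if it matches a locally stored list of known hashes. Real systems such as PhotoDNA use perceptual hashes rather than exact SHA-256, and the file name and function names below are hypothetical.

      import hashlib
      from pathlib import Path

      BLOCKLIST_PATH = Path("known_hashes.txt")  # hypothetical hash list shipped to the device

      def load_blocklist(path: Path = BLOCKLIST_PATH) -> set[str]:
          # Load the set of known hashes; an empty set means nothing gets blocked.
          if not path.exists():
              return set()
          return {line.strip() for line in path.read_text().splitlines() if line.strip()}

      def sha256_of_file(path: Path) -> str:
          # Hash the file in chunks so large attachments don't need to fit in memory.
          h = hashlib.sha256()
          with path.open("rb") as f:
              for chunk in iter(lambda: f.read(8192), b""):
                  h.update(chunk)
          return h.hexdigest()

      def allowed_to_send(attachment: Path, blocklist: set[str]) -> bool:
          # The client would call this before upload and block the send on a match.
          return sha256_of_file(attachment) not in blocklist

      Because both the list and the check live on the device, nothing about a user’s files leaves the phone unless there is a match, which is what the “zero effect on privacy” claim above rests on.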

      • southsamurai@sh.itjust.works

        Yah, that would be a great solution in comparison, but it’s still privacy-invasive. Not as bad, but it still doesn’t give people due process.

        I’m aware that not everywhere in the world recognizes that principle as a right. But I do consider due process a right, and scanning anything on anyone’s devices without a legally justifiable reason is a violation of it.

        I’m not willing to kowtow to a moral panic and just ignore the erosion of privacy “because the children”. And it is a moral panic. As bad as it is, and as much as I personally would enjoy five minutes alone with someone who’s making or using kiddie porn of any stripe, it simply isn’t so common a thing that stripping everyone of their privacy, in any way, is acceptable.

        If they wanna figure out a way to target individuals suspected of that kind of crime, awesome. Untargeted, sweeping invasions simply are not acceptable, and I do not care what the purported reason of the week is: kiddie porn, terrorism, security, stopping drugs, I do not care. I have committed no crime, and I refuse to give away the presumption of innocence for myself or anyone else.

    • katy ✨@lemmy.blahaj.zone

      yeah cracking down on the child trafficking networks operating on telegram would totally not do a thing /s

      • southsamurai@sh.itjust.works

        It wouldn’t. Anyone into that shit will just go somewhere else, and the price of that is yet another erosion of privacy.

        • katy ✨@lemmy.blahaj.zone

          yes, if you go after people sharing it on the network you make it harder for them to access it. stop defending csam, it’s creepy.

          • southsamurai@sh.itjust.works

            Look, just because people don’t agree that a specific method will be effective, that doesn’t mean they support it.

            That’s shitty thinking, and even shittier behavior. You should be ashamed of yourself for going there in what was previously a civil, friendly discussion.

    • Lucy :3@feddit.org

      Hmm. I think many services just don’t and can’t participate because they’d need to break E2EE. Telegram wouldn’t have to with most chats, since those aren’t end-to-end encrypted by default.

  • Mrkawfee@lemmy.world

    When the West wants to censor the internet, it’s always either child protection or national security that’s brought up as the reason.

    • Shiggles@sh.itjust.works

      The west

      Are authoritarian regimes somehow supposed to be more opposed to using children to promote heightened surveillance?

      • Count042@lemmy.ml

        I mean… Yes?

        They don’t need to lie to sell their oppression. They just do it because they’re authoritarian.

        • asdfasdfasdf@lemmy.world

          LOL, this is a joke right? Authoritarian countries don’t lie about reasons for doing things? LMAO

          • pressanykeynow@lemmy.world

            Well, when Russia blocked Telegram, it was just because Telegram refused to send them their users’ data. Simple. Now France seems to be sentencing a person to life in prison because Telegram still refuses to send them their users’ data. But they claim it’s for the children. Same shit, different excuse.

            • asdfasdfasdf@lemmy.world

              Sure, but what the person I replied to is claiming is that e.g. North Korea doesn’t lie to its people about reasons it does things, which is, of course, bullshit.

              • pressanykeynow@lemmy.world

                They didn’t claim it though.

                They said that all governments do some terrible things, but governments that claim they are not authoritarian dress those things up with a pretext that keeps the public from thinking they are doing something terrible.

                In the case of restricting internet freedom or invading other countries, it’s usually “but think of the children”.

      • ZILtoid1991@lemmy.world

        Authoritarian regimes also do the same, although often with adult consensual porn instead of CSAM.

          • dubyakay@lemmy.ca

            Ayo. The country that has

            • a stifling work culture
            • zero tolerance porn laws
            • full blown internet censorship
            • chaebols
            • harsh punishment on even the softest of “drugs”
            • minuscule support for new families

            is definitely not authoritarian.

            It’s okay. With the birth rate they have currently, they won’t have to worry for much longer. Let’s just squeeze the last out of the current generations.

      • Doorbook@lemmy.world

        Authoritarian regimes don’t need to pretend. If they decide you are a risk, they don’t need to gather evidence to put you in prison, so they don’t need to dress up their internet censorship with a respectable-sounding reason.

        The issue here is that the West wants to do the same but needs a valid justification. Instead of working to stop the actual abuse in the first place, they want access to one of the only ways many people have to share information safely.

        You could be technically literate and find your way around all the restrictions, but many people are not, and they need access to secure communication channels to arrange their activism.

        The fact that we don’t see a backlash against Twitter, Facebook, Google, and Apple says a lot about what this is really about.

        The fact that we’re seeing more support for “consent” for kids, and that major cases such as Epstein and Maxwell have been obscured or even hidden where high-profile people are involved, says a lot about their intent.

        • rottingleaf@lemmy.world

          They do need to pretend, because they need assistance from supposedly civilized states for the actions covered by that pretense.

  • nehal3m@sh.itjust.works

    Thank you for choosing “Tyranny as a Service!”

    How would you like this wrapped? [ ] Terrorism [X] Child porn

  • parpol@programming.dev

    If they refused to hand over data that they had about individuals on a warrant, I can see how the arrest was kind of justified.

    If the arrest was for refusing to install a backdoor for law enforcement to spy on anyone they want, then France needs to be kicked out of the EU and sanctioned for human rights violations.

  • ikidd@lemmy.world

    The manufacturing consent system seems to be in full swing on this one.

    • rottingleaf@lemmy.world

      If there were a good alternative, in the sense of public channels that don’t usually get banned, they would have gotten my consent even earlier.

      But the issue is - I don’t even know where to go to discuss shit. Despite TG being full of government trolls.

      • pressanykeynow@lemmy.world

        Despite TG being full of government trolls.

        That’s the world we live in now. If it’s popular, it will be full of trolls: government, corporate, all kinds. It only takes being somewhat popular; they’re here on Lemmy too.

  • daniskarma@lemmy.dbzer0.com

    Did religions join the child protection schemes? Because they are among the biggest child indoctrination and abuse schemes in the world.

  • nondescripthandle@lemmy.dbzer0.com

    The ruling class has been waging war on any social media they don’t have the ability to backdoor. My guess is they’d come for Signal too if they didn’t use it themselves.

    • pressanykeynow@lemmy.world

      The government doesn’t usually need the text of your conversations, just the metadata: who the person talks to, their location, etc. Signal is a US company, so they surely provide all that data. It seems Telegram didn’t.

      • Lowpast@lemmy.world

        Signal does not. https://signal.org/bigbrother/santa-clara-county/

        Tl;dr: Signal gave the court timestamps for three out of nine phone numbers that the court demanded data on. The timestamps were the dates three phone numbers last registered their accounts with Signal. That’s it. That is all the data there was to give.

        This is why I use Signal. This is why I donate monthly to Signal.

  • sumguyonline@lemmy.world

    “Schemes.” It’s as if they know they aren’t actually protecting anyone… Like they would just let anyone torment their children if they claimed religious protections and offered a big enough bribe (I know for a fact that is how it actually works). But sure, Telegram is the problem, not fed bastards hunting innocent people because the bad people bribed them to be left alone. Be a 1%er or be investigated when you don’t kowtow to the 1%. Your choice, apparently.

  • yetAnotherUser@lemmy.ca

    The BBC contacted Telegram for comment about its refusal to join the child protection schemes and received a response after publication which has been included.

    Where is it? I didn’t find it anywhere in the article.

  • katy ✨@lemmy.blahaj.zone

    imagine going to jail just because you refused to address the child abuse and csam on your own network.

      • katy ✨@lemmy.blahaj.zone

        meta does get pointed out but not as much since meta actually does things to combat csam.

        twitter gets called out ALL the time, mostly because elon himself is intervening to reinstate people who share csam because he fired all the trust and safety teams.

      • HauntedCupcake@lemmy.world

        Huh, it’s maybe as if, nooooo… it couldn’t be, the Zuck and Elon are my trusted friends, my confidants, they wouldn’t. They couldn’t. No way in hell they’d sell or otherwise compromise my personal data

  • Mac

    Telegram: “Man, fuck them kids bruh!”

    • JigglySackles@lemmy.world

      Those programs are about mass surveillance and are wrapping themselves in the sheep’s clothing of “protecting kids”.

        • pressanykeynow@lemmy.world

          Why should they? Should every piece of mail (physical or not) you receive be opened and read? Should the government have access to everything you do on your phone or PC? Should the government moderate your house? You’ve gone full 1984.

          • atrielienz@lemmy.world

            Even Facebook doesn’t allow CSAM in public profiles. You can’t just pull up Facebook and see that on your regular feed. Closed groups are a different story. Why should this be different?

            Mind you I’m not saying that the CEO should be criminally responsible for what users on the platform post. I’m pointing out that moderation is a thing even on some of the worst offenders in the space.

            • pressanykeynow@lemmy.world

              You didn’t answer my questions.

              What moderation do you want? And how would you prevent “moderation” from becoming censorship?

              Aren’t there people whose job is to prevent crimes? Why does some IT person with no knowledge of crime need to do their job for them?

              • atrielienz@lemmy.world

                Because your questions aren’t germane to the point I was making. In fact, the first question, “how would you prevent ‘moderation’ from becoming censorship,” is literally answered by my second comment. Facebook already does this with Facebook Messenger. But even if they didn’t, Signal has features that allow encryption.

                So what you’re saying is that criminals who aren’t using encryption (on a platform where encryption features are readily available) shouldn’t be moderated on a platform where their messages use a company’s cloud bandwidth. Does the company not have rights? And if we agree that the company has rights, then they also have to follow the law.

                Yes, there are people whose jobs are to investigate crimes and try criminals in a court of law (not prevent them, because policing is reactionary, not preventative). That was a poor question to ask. You’re literally acting like we don’t employ thousands of people across various social media and messaging platforms to review and moderate things like CSAM.

                The gist for me is that criminals are gonna do criminal things, but at the end of the day these are our public spaces. Just because I don’t want to be surveilled in public or live in a police state doesn’t mean I want criminals to go unprosecuted for the crimes they commit, simply because someone cares more about their bottom line than about moderating a messaging platform they provide to the public.

                We aren’t talking about end-to-end encrypted messages here. We’re talking about messages with no such encryption that can be viewed by anyone. There are literally public groups on Signal being used by terrorist organizations. And while Signal has repeatedly refused to give up encryption keys for the chats that are encrypted (as it should), any criminal who isn’t using encryption is not protected by it and should be moderated.