• Lvxferre · 7 months ago

    It disgusts me that, nowadays, the mere presence of this “think of the children!” discourse immediately makes me ask “which instance of censorship is being hidden behind it this time?”. Shitty people have been abusing the child safety discourse for so long that everyone else is getting desensitised to it.

    EU’s chat control:

    1. Is a privacy hell. It gives corporations free rein to scan the content being sent. They will not stop at “not CSAM, move on”; the means and the excuse to use the content of your messages to profile you for advertising are right there, and they will use them.
    2. Violates a basic legal principle, the presumption of innocence. It expects you to prove that you are not sharing CSAM.
    3. Employs inaccurate techniques. Machine learning is great for tasks where the odd mistake, like pointing at a puppy pic and saying “it’s a cat”, is no big deal; the same is not true when you’re dealing with people. You could literally wreck someone’s life because some dumb bot mislabelled something that is not CSAM as CSAM. (See the back-of-the-envelope sketch after this list for how badly that scales.)
    4. Punishes people for being tech-savvy enough to see the problems. It’s yet another violation of the presumption of innocence: if you don’t want Meta to vulture on your data, you’re assumed to be a paedophile, so you’re blocked from sharing links, videos and images???
    5. Is clearly ineffective. How much do you want to bet that the fuckers sharing CSAM will simply allow the scanning so they can keep sharing links, and the links will point to password-protected .zip files? They probably already do this. (The second sketch below shows why a scanner gets nothing useful out of an encrypted payload.)
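
    To put rough numbers on point 3, here’s a back-of-the-envelope sketch of the base-rate problem. Every figure in it is an assumption picked for illustration, not an official statistic; the point is only that even a very accurate classifier, applied to billions of mostly-innocent messages, drowns the real hits in false alarms.

        # Base-rate sketch: all numbers below are illustrative assumptions.
        messages_per_day    = 10_000_000_000  # messages scanned daily (assumed)
        csam_share          = 1e-6            # fraction of traffic that is actually CSAM (assumed)
        false_positive_rate = 0.001           # 99.9% specificity, already optimistic (assumed)
        true_positive_rate  = 0.90            # 90% of real CSAM gets flagged (assumed)

        innocent = messages_per_day * (1 - csam_share)
        guilty   = messages_per_day * csam_share

        false_alarms = innocent * false_positive_rate
        true_hits    = guilty * true_positive_rate

        print(f"innocent messages flagged per day: {false_alarms:,.0f}")
        print(f"actual CSAM flagged per day:       {true_hits:,.0f}")
        print(f"share of flags that are wrong:     {false_alarms / (false_alarms + true_hits):.1%}")

    With these made-up numbers, roughly 99.9% of everything flagged is innocent, which is exactly the “wreck someone’s life over a mislabelled picture” scenario.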

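    And for point 5: once the payload is encrypted with a key the scanner doesn’t have, there is nothing meaningful left to scan. A minimal sketch, using the third-party cryptography package (pip install cryptography) as a stand-in for whatever password protection the shared archive uses:

        from cryptography.fernet import Fernet

        # The key is exchanged out-of-band; the scanning middlebox never sees it.
        key = Fernet.generate_key()
        ciphertext = Fernet(key).encrypt(b"whatever file contents you like")

        # This is all a scanner gets to look at: opaque bytes that reveal
        # nothing about the plaintext without the key.
        print(ciphertext[:60])
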
    chats of employees of security authorities and the military are also to be exempted from chat control.

    They’re basically acknowledging that, in some situations that have nothing to do with CSAM, it’s completely reasonable not to want your privacy violated by the proposed law. And they’re still trying to pass it. /facepalm