I hate having to make this post. I enjoy the Matrix software, I enjoy the fact that it’s open source, developed by a non-profit, and that the protocol is completely original rather than built on top of existing standards. The layout of Element, the flagship client, is great, and I have met a great number of enjoyable people on the protocol. However, what I saw from the Matrix admins the other day was straight-up abysmal: shockingly ineffective moderation and a prolonged lack of action that have me debating whether I should move away from Matrix altogether, considering this happened on the flagship Matrix.org instance, in their official and second-biggest room.

So about two days ago, I noticed multiple notifications coming from the Matrix HQ room, which I had set to notify me for all messages. I took a look at what was being talked about, and the chat was being inundated with images of CSAM from two accounts. While it did take a bit of time for someone to ping the admin account, the resulting response, or rather the lack of one, was simply unacceptable by any stretch of the imagination.

It took over an hour and a half of these two accounts constantly sending images of CSAM before the admin account stepped in to ban them and delete the images in question. In that time the admin account was pinged nine times, and the room’s moderators were all pinged at once in one message. After the accounts posting the CSAM were banned, it took at least another two hours for all of the images to be deleted; the admins left some photos up until users started pointing out that they were still there.

This is the official Matrix instance hosted by the Matrix Foundation. Assuming the admin account operates in the same timezone the Matrix Foundation is based in, this all happened in the middle of the day, around noon. This is the second-biggest room on the official Matrix server and likely the second-biggest room on the entire Matrix protocol, with over 60K members. This room is often people’s first introduction to Matrix, and for an hour and a half, visitors to the room would be greeted with CSAM; for another two hours after that, they would still be greeted with it if they simply scrolled up in the message history.

This is beyond unacceptable. I understand that preventing CSAM from being posted in the first place is hard, since Matrix may not have the same tools other platforms do to detect that content and delete it automatically, but the fact that there was no action from Matrix moderation in their official room, on their official instance, for such an extended amount of time is inexcusable. I’ve since left the Matrix HQ room out of sheer disappointment, and am now considering whether I should switch to XMPP.
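
(To be concrete about what I mean by tools: even a simple bot can hold back media from untrusted senders until a human looks at it. Below is a rough sketch of that idea using the third-party matrix-nio library, with a made-up homeserver, bot account, and allowlist; it’s purely illustrative, not something the Foundation actually runs.)

```python
# Hypothetical moderation-bot sketch using matrix-nio: redact image events
# from senders who aren't on a trusted list so a moderator can review them.
# All names and credentials below are placeholders.  The bot account needs
# permission in the room to redact other users' events.
import asyncio
from nio import AsyncClient, MatrixRoom, RoomMessageImage

HOMESERVER = "https://matrix.example.org"    # placeholder homeserver
BOT_USER = "@modbot:example.org"             # placeholder bot account
TRUSTED_SENDERS = {"@alice:example.org"}     # images from these users are left alone

client = AsyncClient(HOMESERVER, BOT_USER)

async def on_image(room: MatrixRoom, event: RoomMessageImage) -> None:
    # A real bot would also check account age, compare media hashes against
    # known-bad lists, rate-limit, etc.; this only checks an allowlist.
    if event.sender not in TRUSTED_SENDERS:
        await client.room_redact(
            room.room_id, event.event_id, reason="held for moderator review"
        )

async def main() -> None:
    await client.login("bot-password")               # placeholder password
    client.add_event_callback(on_image, RoomMessageImage)
    await client.sync_forever(timeout=30000)         # run until stopped

asyncio.run(main())
```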

I really hope that moderation on the official Matrix instance, especially within their official space, improves in the near future. But honestly, the fact that this kind of thing can happen this late into the room’s existence already speaks volumes about how ineffective the moderation has likely been since before I started using Matrix on a near-daily basis. And while I understand that the protocol is decentralized and that this doesn’t reflect the administration of every instance, room, and space, it really has me questioning whether I should be using and/or supporting a protocol where the people behind it are stunningly ineffective at dealing with some of the most heinous, vile things someone can do or see.

    • Binzy_Boi@piefed.social (OP) · 2 months ago

      As the other comment said, it’s less used than Matrix, but it’s more that I’ve heard XMPP, unlike Matrix, doesn’t really have official chatrooms directly managed by the XMPP Standards Foundation.

      Whether that’s true or not, I’m not quite sure; I still need to do some exploring with XMPP and see for myself. But if it is the case, that alone would prevent a situation like the one with Matrix, where this kind of content gets posted under the organization’s direct supervision.

      Again, I’m reiterating just to make sure I’m being clear: the issue for me isn’t that CSAM will inevitably show up on the protocol, or even on an official server for it. I’d be uncomfortable using a protocol if that kind of content was being posted in official chatrooms and not properly dealt with.

      • nis@feddit.dk · 2 months ago

        That is just “less spam by obscurity”. If that is what you want, your best option would be to write your own chat application and be the only one using it.

        The fact that bad actors can misuse X is not really a good argument for not using X appropriately, IMO.

        I don’t even know who moderates that room. It was not the first Matrix room I joined; actually, I don’t think I’ve ever been in it. But since you find the moderation lacking, you could volunteer to be part of the moderation team. Be the change you want to see in the world :)

      • xionzui@sh.itjust.works · 2 months ago

        If your only issue is with the official server, why not just keep using Matrix but join a different server?

        • Binzy_Boi@piefed.social (OP) · 2 months ago

          Feel free to disagree, but it’s a bit of a moral question for me whether I’d be comfortable using tools developed by a group with issues like this.

          As said in another comment, I’m already on a separate server, tchncs. I just personally wouldn’t want my use of the protocol to be read as me being content with, or indifferent to, these serious moderation issues in their official rooms.

          • xionzui@sh.itjust.works · 2 months ago

            I’m not sure I see poor performance by volunteer moderators, on a service tangential at best to the development of the open-source server software, as a moral issue with that software in any way. But as you said, you’re free to disagree.

  • krolden@lemmy.ml · 2 months ago

    60k users

    Sorry, are you expecting a few mods to constantly monitor a chat with more people than the entire userbase of Lemmy? How would this be any different on any other platform with so many users?

    • Binzy_Boi@piefed.social (OP) · 2 months ago

      While there are 60K users in the room, only an incredibly small fraction of them are active there. It works like a Discord server: even when a server has many thousands of members, only a small fraction of them actively post.

      These 60K users could be anything from dead accounts to lurkers, outside of the small percentage that actually posts to the room. However, as an official room under the direct supervision of the Matrix Foundation, it still gives people a ridiculously bad first impression of the protocol.

    • Binzy_Boi@piefed.social (OP) · 2 months ago

      I’m currently on tchncs. As I said in my last paragraph, while I understand this doesn’t reflect on all servers, rooms, and spaces, it still makes me wonder if I should be using a protocol where the official space run by the foundation building it has moderation so ineffective that CSAM can sit in their biggest official room for multiple hours.

      To give a comparison, let’s take Tor. Tor is used by people to stay anonymous online. This is of course a bit of a double-edged sword: on the one hand, you have the intended purpose, where people use it to circumvent censorship or simply to increase their privacy in their daily browsing; on the other, you have people who use it to access and distribute illegal services or content, such as buying and selling drugs or distributing and viewing CSAM.

      Now, do I feel uncomfortable using Tor because I see the project as not doing enough to keep bad people off its network? Of course not; there are a lot of great people who use the network, from hobbyist projects to people simply enhancing their privacy. However, if the Tor Project ran an onion site directly administered by them that constantly had moderation issues with CSAM or illegal services, then I would be more likely to use something like I2P, because I wouldn’t be comfortable using a service where the developing organization didn’t properly moderate that content on their own onion site.

      That’s the case with Matrix. I’m not expecting them to be on top of everything sent to their server immediately, but this was the biggest room in their official space, and it becomes a question of whether I’d be comfortable using a protocol where the foundation behind its development doesn’t take adequate action against CSAM in their own rooms.

  • Count042@lemmy.ml · 2 months ago

    This is actually why I switched to XMPP for my chat server of about 200 people.

    The room being the atomic unit, combined with the automatic caching of media from any room that one of your users is a member of, even rooms hosted on other servers, was too much of a legal liability.

    It became even worse when the CSAM users found my server, which had open signups but required a captcha and an email address to sign up.

    Deleting the users, banning the rooms, and deleting the cached media became a multiple-times-a-day chore.
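
    For anyone stuck doing the same cleanup on Synapse, most of that chore can in principle be scripted against the admin API. A rough sketch, assuming a server-admin access token and the documented v1 endpoints (deactivate user, delete/block room, purge remote media cache), would look something like this; double-check the exact paths against your Synapse version, and treat the names below as placeholders.

    ```python
    # Hypothetical cleanup helpers for a Synapse homeserver admin.
    # HOMESERVER and ADMIN_TOKEN are placeholders.
    import time
    import requests

    HOMESERVER = "https://matrix.example.org"
    ADMIN_TOKEN = "replace-with-a-server-admin-access-token"
    HEADERS = {"Authorization": f"Bearer {ADMIN_TOKEN}"}

    def deactivate_user(user_id: str) -> None:
        # Deactivate the account and erase its profile data.
        requests.post(
            f"{HOMESERVER}/_synapse/admin/v1/deactivate/{user_id}",
            headers=HEADERS, json={"erase": True},
        ).raise_for_status()

    def block_room(room_id: str) -> None:
        # Remove local users from the room, purge its history from the
        # database, and block it so local users can't rejoin.
        requests.delete(
            f"{HOMESERVER}/_synapse/admin/v1/rooms/{room_id}",
            headers=HEADERS, json={"block": True, "purge": True},
        ).raise_for_status()

    def purge_remote_media_cache() -> None:
        # Drop locally cached copies of media fetched from other servers.
        now_ms = int(time.time() * 1000)
        requests.post(
            f"{HOMESERVER}/_synapse/admin/v1/purge_media_cache",
            headers=HEADERS, params={"before_ts": now_ms}, json={},
        ).raise_for_status()

    # Example run for one incident (made-up IDs):
    # deactivate_user("@spammer:matrix.example.org")
    # block_room("!abcdef:badserver.example")
    # purge_remote_media_cache()
    ```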

    XMPP, so far, doesn’t have those issues. And it is so light on resources. I highly recommend the switch.

    EDIT: the moderation tools on XMPP are also far superior.