Lemmy experienced a CSAM attack this week, with significant ramifications for the entire network. It started early in the week, when new accounts created on lemmy.world posted Child Sexual Abuse Material (CSAM) in multiple communities. This prompted the lemmy.world admins on Monday to switch registration to application-only, ending open signups on the server. The next day the CSAM attack continued, this time from accounts made on other servers that posted to communities on lemmy.world. In response, the lemmy.world admins closed the lemmyshitpost community, as that seemed to be the main focus of the attack.
This problem with CSAM on Lemmy differs from the problem Mastodon has with CSAM, as reported earlier this summer. The Stanford Internet Observatory report found that CSAM on Mastodon often existed below the surface, with the vast majority of users never encountering the material. The attack on Lemmy, by contrast, seems to have been executed with the purpose of getting people to see the material, and quite a few people reported seeing it.
One of the major impacts of this attack relates to technical design choices that Lemmy has made. Images that are posted on server A get sent to, and stored on, server B when someone on server B follows a community on server A. Images posted on lemmy.world, the biggest Lemmy server, therefore exist in the databases of most other Lemmy servers as well. This means that, due to the attack on lemmy.world, many Lemmy admins now have CSAM images in their database. With that comes liability for the admins, as well as reporting requirements. IFTAS has a good overview of resources to help admins navigate these requirements.
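To make that concrete, here is a minimal sketch of the general pattern, not Lemmy's actual code: when a federated post arrives, the receiving server downloads the referenced image and writes it into its own media store, which is how material uploaded to server A ends up stored on server B. The paths and function names below are hypothetical.

```python
# Illustrative sketch only: the general pattern of a federating server
# mirroring a remote image into local storage. Not Lemmy's actual code.
import hashlib
import pathlib
import urllib.request

MEDIA_DIR = pathlib.Path("/var/lib/fediverse/media")  # hypothetical local media store

def cache_remote_image(image_url: str) -> pathlib.Path:
    """Download an image referenced by an incoming federated post and store it locally."""
    data = urllib.request.urlopen(image_url, timeout=10).read()
    MEDIA_DIR.mkdir(parents=True, exist_ok=True)
    # Store under a content hash so duplicate uploads collapse to one file.
    local_path = MEDIA_DIR / hashlib.sha256(data).hexdigest()
    local_path.write_bytes(data)
    return local_path
```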
Other aspects of Lemmy have compounded the issue of third-party servers unwittingly hosting CSAM. It is currently not possible to federate with other Lemmy servers and receive the text of a post without also receiving and hosting the post's images. Mastodon, for example, does allow servers to reject images while still accepting text. Selective deletion of images from the database is also hard to do on Lemmy, and as a result, some servers decided to delete all federated images in their database.
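The "accept the text, reject the media" option that Lemmy lacks amounts to stripping the attachments from an incoming post before storing it, while keeping the body. A hedged sketch of that idea, using generic ActivityPub field names rather than Lemmy's actual schema:

```python
# Illustrative sketch of rejecting media while keeping text for an incoming
# ActivityPub object. Field names follow generic ActivityPub conventions,
# not Lemmy's internal schema; Lemmy does not currently offer this option.
def strip_media(activity: dict) -> dict:
    """Return a copy of an incoming activity with image attachments removed."""
    cleaned = dict(activity)
    obj = dict(cleaned.get("object", {}))
    # Drop attached images but keep the post body ("content") untouched.
    obj["attachment"] = [
        a for a in obj.get("attachment", [])
        if a.get("mediaType", "").split("/")[0] != "image"
    ]
    obj.pop("image", None)  # also drop any preview/thumbnail image
    cleaned["object"] = obj
    return cleaned
```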
One of the ways admins are dealing with this new threat is a new AI scanning tool called Lemmy Safety, created by the admin of the dbzer0 Lemmy server. It scans all images in the Lemmy database for potential CSAM and automatically deletes them, and it can also be used to scan newly incoming images. While this can help in the short term with making sure no CSAM remains on a server, it might interfere with administrators' legal obligations: in various jurisdictions, administrators are required to report to the relevant authorities when they become aware of CSAM. Again, this collection of resources by IFTAS is a good start with helpful information.
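For illustration, the scan-and-delete approach boils down to something like the following sketch. This is not the actual Lemmy Safety code; the media path and the classifier hook are stand-ins for whatever the real tool uses.

```python
# Rough outline of a scan-and-delete pass over a server's media store.
# NOT the actual Lemmy Safety code; `looks_like_csam` is a placeholder
# for the AI classifier the real tool relies on.
import pathlib

MEDIA_DIR = pathlib.Path("/var/lib/fediverse/media")  # hypothetical media store

def looks_like_csam(path: pathlib.Path) -> bool:
    """Placeholder for an AI classifier; always returns False in this sketch."""
    return False

def scan_and_delete(media_dir: pathlib.Path = MEDIA_DIR) -> int:
    """Scan every stored image and delete anything the classifier flags."""
    deleted = 0
    for path in media_dir.rglob("*"):
        if path.is_file() and looks_like_csam(path):
            path.unlink()  # outright deletion, which is where the legal concern comes in
            deleted += 1
    return deleted
```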
It is clear that this is a complicated problem for volunteer admins to deal with. Multiple administrators concluded that the risks and complications of continuing to host Lemmy servers are not worth it. Other servers, such as lemm.ee, have made extensive plans for how to deal with the situation, such as disabling image uploads and applying a custom patch to prevent images from other servers from being saved on their server. They also float the idea of an invite-based registration system.
On the Matrix chat channels for Lemmy admins, tension is rising, and people are frustrated with the lack of acknowledgement and communication from the developers @dessalines and @nutomic. The developers have not communicated anything about this on either their Matrix chat channels or on their own Lemmy server. On GitHub, the dbzer0 admin proposed expanding his automated CSAM scanning to allow saving and reviewing potential hits instead of outright deleting them. Developer @dessalines stated that this “is not something we have time for rn.” For servers operated under US law, however, administrators are required to save CSAM they encounter, report it to the authorities, make it invisible to users, and restrict access to the saved material as much as possible. The outright rejection by the main developer to build tools that can help admins satisfy these legal requirements does not help the confidence of admins who are worried about their responsibilities.
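The "save and review" alternative being asked for essentially means quarantining flagged files with restricted permissions instead of unlinking them, so they remain available for a report to the authorities. A rough sketch under those assumptions (the paths are hypothetical, and this is not code from any of the tools mentioned above):

```python
# Sketch of a quarantine step: move a flagged image out of public storage
# and lock down access, rather than deleting it outright. Paths are hypothetical.
import pathlib
import shutil
import stat

QUARANTINE_DIR = pathlib.Path("/var/lib/fediverse/quarantine")  # hypothetical

def quarantine(path: pathlib.Path) -> pathlib.Path:
    """Move a flagged file out of public storage and restrict access to it."""
    QUARANTINE_DIR.mkdir(parents=True, exist_ok=True)
    QUARANTINE_DIR.chmod(stat.S_IRWXU)                 # owner-only directory access (0700)
    target = QUARANTINE_DIR / path.name
    shutil.move(str(path), str(target))                # no longer served to users
    target.chmod(stat.S_IRUSR | stat.S_IWUSR)          # 0600: readable only for reporting
    return target
```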
Meanwhile, reports are starting to pop up of a new type of CSAM attack: posts titled ‘Tiktok Cringe’ that first show a few seconds of a random TikTok video and then switch to CSAM material. This makes it very easy for moderators to miss the content unless they watch the entire video. At this point, it is unclear whether this was an isolated incident or part of a bigger attack. How this situation will develop in the near future remains to be seen, but I’m sure we’ll come back to it soon.
Social network Minds has been working on implementing ActivityPub and is now mostly connected to the fediverse. Minds, which launched in 2015, has a strong focus on free speech and cryptocurrency, and multiple outlets have reported on the far-right nature of its user base. Minds announced that it joined the fediverse in a not particularly clear post. So far it seems that posts made on Minds are visible on Mastodon, but comments made by Mastodon users on a Minds post are not visible on the Minds platform itself. The culture and ethics of Minds seem to differ significantly from those of most fediverse servers, and if Minds becomes more prominently visible within the fediverse, this will likely lead to friction and conversations around defederation. On the other hand, it is another indication that ActivityPub is becoming the standard protocol for other social networks to implement.
A contributor to the Tusky project (an open-source Android client for Mastodon) leaves the project and writes a blog post alleging financial mismanagement. The other contributors have written an extensive explanation of the situation, denying the allegations. While the situation itself is not particularly impactful for the fediverse, it is a good illustration of how difficult the organisational side of collectively building software on the fediverse can be.