cross-posted from: https://beehaw.org/post/6853479
mastodon.art has decided to suspend firefish.social after its administrator was found boosting posts from a known harasser on another instance. mastodon.art takes a firm stance against racism, and suspending entire instances in these situations is part of its policy as a safe space. The harasser in question has a history of using slurs, harassing users, and editing screenshots to spread misinformation. However, the administrator of firefish.social has now forged a screenshot of their own to paint mastodon.art in a negative light.
It’s not an issue, it’s an intentional and important feature.
Don’t want to be defederated? Don’t let chuds and bigots on your instance. It’s pretty simple.
While this is the main reason for defederation, I think it’s important to recognize that humans are going to human and, as such, you’re going to see defederation over extremely petty issues. In human history we’ve literally started wars over petty issues, costing countless lives; defederating is small stakes in comparison.
With that being said, I agree with other posters that defederation is a tool. Like any other tool, it will be used in ways not everyone expects. A hammer can be used as a can opener if you really want, or as art, or in an elaborate machine. Tools may be designed for a purpose, but humans are creative, and you can’t enforce that tools are only used in certain ways.
deleted by creator
I would advise against armchair hypothesizing about the mental health state of individuals based on how they post online.
Hehe, the irony of this is that it’s a good rule for this instance, but the whole kerfuffle seems to be based on armchair hypothesizing about individuals: not just about how they post online, but about how someone two or three times removed from them by online association may have posted online at some point.
Guess it goes to show what happens without that rule.
Yes, and it’s already been discussed whether this post should be removed. There’s no quick and easy answer to a question like this, so much as there are many shades of gray. There can be valuable discussion here so long as we consider how to have it in good faith in a public forum.
There are at least two technological solutions to that at an end-user level:
And one at a pro-user level:
There are proposals for creating “councils” that could maintain blacklists, whitelists, chains of trust, and whatever else, but once analyzed in more depth, they all seem to lead to more knee-jerk reactions, not fewer.
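As a toy illustration of the list-based moderation those proposals describe, here is a minimal sketch of filtering federated posts by instance. Everything here (the function name, the tuple format, the example domains other than the instances named in the thread) is hypothetical and not part of any real fediverse software:

```python
def filter_posts(posts, blocklist=None, allowlist=None):
    """Return only posts whose origin instance passes the list checks.

    posts: iterable of (instance_domain, content) tuples.
    blocklist: domains to reject outright (a full-instance suspension).
    allowlist: if given, only posts from these domains are accepted.
    """
    blocklist = set(blocklist or [])
    kept = []
    for instance, content in posts:
        if instance in blocklist:
            continue  # suspended instance: drop everything from it
        if allowlist is not None and instance not in allowlist:
            continue  # allowlist mode: drop anything not explicitly trusted
        kept.append((instance, content))
    return kept


posts = [
    ("mastodon.art", "art post"),
    ("bad.example", "harassment"),
    ("firefish.social", "boost"),
]
# Blocklist mode drops only the listed instance.
print(filter_posts(posts, blocklist={"bad.example"}))
```

The tension the thread describes shows up even in this tiny sketch: a blocklist entry is all-or-nothing for an instance, so one administrator's behavior affects every user on their server.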
I think the current discussion is a good example showing that it’s not as black and white, and definitely not easy, as you make it out to be.