• 0 Posts
  • 103 Comments
Joined 6 months ago
Cake day: December 16th, 2023

  • I think the type of situation where you start a business under your own labor and then bring others on is possible in a well-proportioned socialist economy, just not by exploiting their labor.

    You wouldn’t be able to hire “just a dishwasher.” You’d enter into a mutually and equally beneficial partnership with somebody with whom you would share the labor and split the revenue from that labor equally.

    The exact form of that relationship and the details of who is responsible for what are largely immaterial to whether it’s exploitative at that point.




  • I suppose the importance of the openness of the training data depends on your view of what a model is doing.

    If you feel a model is more like a media file that the model loader plays back, where the prompt is a kind of control over how you access that model, then yes, I suppose there’s not much trustworthiness value in the model’s training corpus being open.

    I see models more in terms of how any other text encoder or serializer would work, as if you were, say, encoding text manually. While there is a very low chance of any “malicious code” being executed, the importance lies in being able to check your expectations about how your inputs are encoded against what the provider is telling you.

    As an example attack vector, much like a malicious-replacement attack on any other software artifact: if I were to download a pre-trained model from what I thought was a reputable source, but was man-in-the-middled and handed a maliciously trained model instead, then any system relying on that model would suddenly be compromised in terms of its expected text output. Obviously that exact problem could be mitigated with some hash checking (sketched at the end of this comment), but I hope you can see that in some cases even that wouldn’t be enough, such as a malicious “official” provenance.

    As these models become more prevalent, the ability to guarantee their integrity will become more and more of an issue.
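
    As a minimal sketch of the kind of hash check I mean (the file name and expected digest here are made-up placeholders, not a real artifact):

    ```python
    import hashlib

    # Placeholder values: "model.safetensors" and this digest are hypothetical,
    # standing in for whatever the provider actually publishes.
    EXPECTED_SHA256 = "0000000000000000000000000000000000000000000000000000000000000000"

    def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
        """Stream the file through SHA-256 so multi-GB model files fit in memory."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    actual = sha256_of("model.safetensors")
    if actual != EXPECTED_SHA256:
        raise RuntimeError(f"model hash mismatch: got {actual}")

    # Caveat from above: this only proves the file matches the published digest.
    # If the "official" source itself ships a malicious model plus its matching
    # digest, the check passes anyway; integrity is not the same as provenance.
    ```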