Representative take:

If you ask Stable Diffusion for a picture of a cat it always seems to produce images of healthy looking domestic cats. For the prompt “cat” to be unbiased Stable Diffusion would need to occasionally generate images of dead white tigers since this would also fit under the label of “cat”.

  • self@awful.systems · 1 year ago

    I had severe decision paralysis trying to pick out quotes, because every post in that thread is somehow the worst post in that thread (and it’s only an hour old, so it’s gonna get worse), but here:

    Just inject random ‘diverse’ keywords in the prompts with some probabilities to make journalists happy. For an online generator you could probably take some data from the user’s profile to ‘align’ the outputs to their preferences.

    solving the severe self-amplifying racial bias problems in your data collection and processing methodologies is easy, just order the AI to not be racist

    …god damn that’s an actual argument the orange site put forward with a straight face