• jj4211@lemmy.world

    There are signs of three distinct interpretations in the result:

    • On topic: cleaning a wild bird you are trying to save
    • Preparing a store-bought turkey (removing a label)
    • Preparing a wild bird that has been caught

    It’s actually a pretty good illustration of how AI assembles “information-shaped text”: how smooth it can look and yet how dumb it can be underneath. Unfortunately, advocates will just say “it doesn’t get this specific thing wrong when I ask it, or when I ask another LLM, so there’s no problem”, even as it gets other stuff wrong. It’s a weird situation: you’d better be able to second-guess the result, which means you can never be confident in an answer you didn’t already know, and when that’s the case, it’s not that great for factual stuff.

    For “doesn’t matter” content it may do fine (generated alternatives to stock photography, silly meme pictures, random prattle from background NPCs in a game), but for stuff that matters, generative AI is frequently more of a headache than a help.