• MxM111@kbin.social
    1 year ago

    I can say exactly the same thing: LLMs always hallucinate, it's just that sometimes they do it correctly and sometimes not.

    • sab@kbin.social
      1 year ago

      So you’d say it’s a hallucination machine rather than a bullshit generator?

      I think you’re on to a good point: the industry seems to say their model is hallucinating whenever it does something they don’t approve of, but the fact of the matter is that it’s doing the exact same thing it always does.