So you’d say it’s a hallucination machine rather than a bullshit generator?
I think you’re on to a good point - the industry seems to say their model is hallucinating whenever it does something they don’t approve of, but the fact of the matter is that it’s doing the exact same thing it always does.
I'd say exactly the same thing: LLMs always hallucinate, it's just that sometimes they happen to hallucinate correctly and sometimes they don't.