So you’d say it’s a hallucination machine rather than a bullshit generator?
I think you’re on to a good point - the industry tends to say its model is hallucinating whenever it does something they don’t approve of, but the fact of the matter is that it’s doing exactly the same thing it always does.