I suspect that this is the direct result of AI-generated content simply overwhelming any real content.

I tried DDG, Google, Bing, and Qwant, and none of them really help me find the information I want these days.

Perplexity seems to work, but I don’t like the idea of an AI giving me “facts”, since they’re mostly based on other AI posts.

  • Lvxferre · 4 hours ago

    Or, on a deeper level: they’re pretty good at regurgitating what we interpret as bullshit. They simply don’t care about the truth value of their statements at all.

    That’s part of the problem: you can’t prevent them from doing it; it’s like trying to drain the ocean with a small bucket. They shouldn’t be used as a direct source of info for anything you won’t check afterwards. At least in kitnaht’s use case, if the LLM is bullshitting it should be obvious, but go past that and you’ll have a hard time.