• Amoeba_Girl@awful.systems · 1 day ago

    To be honest, as someone who’s very interested in computer-generated text and poetry and the like, I find generic LLMs far less interesting than more traditional Markov chains, because they’re too good at reproducing clichés to the exclusion of anything surprising or whimsical. So I don’t think they’re very good for the unfactual either. A homegrown neural network would probably give better results.
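
    (For context, a minimal sketch of the kind of word-level Markov chain text generator being contrasted with LLMs here; the corpus, function names, and order parameter are illustrative, not taken from any bot mentioned in the thread.)

        import random
        from collections import defaultdict

        def build_chain(text, order=1):
            # Map each run of `order` words to the words observed to follow it.
            words = text.split()
            chain = defaultdict(list)
            for i in range(len(words) - order):
                chain[tuple(words[i:i + order])].append(words[i + order])
            return chain

        def generate(chain, order=1, length=20):
            # Walk the chain, sampling each next word uniformly from its followers.
            state = random.choice(list(chain))
            out = list(state)
            for _ in range(length):
                followers = chain.get(tuple(out[-order:]))
                if not followers:               # dead end: jump to a random state
                    state = random.choice(list(chain))
                    out.extend(state)
                    continue
                out.append(random.choice(followers))
            return " ".join(out)

        corpus = "the cat sat on the mat and the rat sat on the cat"
        print(generate(build_chain(corpus)))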

    • bitwolf@sh.itjust.works · 1 day ago

      Agreed, our chat server ran a Markov chain bot for fun.

      Compared to the ChatGPT bot on a second server I frequent, it gave much funnier and more random responses.

      ChatGPT tends to just agree with whatever it chooses to respond to.

      As for real-world use: ChatGPT produces the wrong answer 90% of the time. I’ve enjoyed Circuit AI, however. While it also produces incorrect responses, it shares its sources, so I can more easily get to the right answer.

      All I really want from a chatbot is a gremlin that finds the hard things to Google on my behalf.

      • Amoeba_Girl@awful.systems · 1 day ago

        Absolutely. Every single one of these tools has gotten less interesting as it gets refined until it can only output the platonic ideal of kitsch.