Researchers say that the model behind the chatbot fabricated a convincing bogus database, but a forensic examination shows it doesn’t pass for authentic.

    • PetDinosaurs@lemmy.world · 8 points · 7 months ago

      It’s just modeling humans. I was only a lab TA for two semesters, and I caught so many fake data sets.

  • Ganbat@lemmyonline.com · 17 points · 7 months ago

    So… Software designed to make things up made something up when asked to make something up? Okay…

  • Kecessa@sh.itjust.works · 9 points · edited · 7 months ago

    There was someone on the radio the other day talking about doing research with it for their show. They started by asking a simple math question, which it got wrong, and then the conversation devolved into ChatGPT inventing anecdotes when asked whether a Nobel medal had ever been brought to space. It ended up saying it didn’t know why it kept inventing anecdotes instead of finding reliable info!

    All that to say: it doesn’t know what is and isn’t reliable information, so it builds answers based on what it interprets you might want to read.