• SavedKriss@lemmy.world · 7 months ago

    Wouldn’t it be easier to hardcode an entire encyclopedia into the servers instead of trying to limit a generative model to giving only “right” answers?

    • jeffw@lemmy.world (OP) · 7 months ago (edited)

      That can’t answer most questions, though. For example, I hung a door recently and had some questions that it answered (mostly) accurately. An encyclopedia can’t tell me how to hang a door.

      • Balder@lemmy.world · 7 months ago

        Yeah, there’s a reason this wasn’t done before generative AI. It couldn’t handle anything slightly more specific.

      • linearchaos@lemmy.world · 7 months ago

        Same. I was dealing with a strange piece of software; I searched configs and samples for hours and couldn’t find anything about anybody having problems with the weird language it uses. I finally gave up and asked GPT, and it explained exactly what was going wrong and gave me half a dozen fixes to try.

      • btaf45@lemmy.world · 7 months ago

        That can’t answer most questions, though.

        It would make AI much more trustworthy. You cannot trust ChatGPT on anything related to science, because it tells you things like the Andromeda galaxy being inside the Milky Way. The only way to fix that is to program basic known science directly into the AI.
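
        A minimal sketch of what “program basic known science into the AI” could look like: answer from a small curated fact table first, and only fall back to the generative model when nothing vetted matches. The fact entries and the ask_llm fallback are hypothetical placeholders, not any real product’s API.

        ```python
        # Hypothetical sketch: consult a curated fact table before trusting a generative model.
        CURATED_FACTS = {
            ("andromeda galaxy", "distance_mly"): 2.5,   # million light-years, approximate
            ("ngc 300", "distance_mly"): 6.1,            # Caldwell 70
            ("ngc 7000", "distance_ly"): 1600,           # North America Nebula (Caldwell 20), as quoted below
        }

        def answer(subject: str, attribute: str, ask_llm=None) -> str:
            """Return a vetted fact if we have one; otherwise defer to the model (unverified)."""
            fact = CURATED_FACTS.get((subject.lower(), attribute))
            if fact is not None:
                return f"{subject}: {attribute} = {fact} (curated)"
            if ask_llm is not None:
                return ask_llm(f"What is the {attribute} of {subject}?")  # unverified model output
            return "No vetted answer available."

        print(answer("Andromeda Galaxy", "distance_mly"))
        print(answer("NGC 300", "distance_mly"))
        ```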

          • btaf45@lemmy.world · 7 months ago (edited)

            It depends on how you ask the question, and there is also randomization applied to the answer the AI gives you. The point is that you would be foolish to trust AI to accurately answer science questions. Why the f*ck would you want a randomized answer to a science question?

            ME: how far is andromeda from caldwell 70?

            ChatGPT: Caldwell 70, also known as NGC 7000 or the North America Nebula, is an emission nebula located in the constellation Cygnus. It is much closer to us than the Andromeda Galaxy. The North America Nebula is estimated to be around 1,600 light-years away from Earth. Therefore, in comparison to the distance between the Milky Way and Andromeda, Caldwell 70 is significantly closer to us.

            In fact, Caldwell 70 is over twice as far away as Andromeda, because Caldwell 70 is NGC 300, not NGC 7000 (NGC 7000 is Caldwell 20). Also, the AI didn’t even answer the question I actually asked.
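
            For what it’s worth, the “randomization” is the sampling temperature, and most chat APIs let you turn it down. A quick sketch using the OpenAI Python client (the model name is illustrative; temperature 0 makes the output near-deterministic, it does not make it correct):

            ```python
            from openai import OpenAI

            client = OpenAI()  # reads OPENAI_API_KEY from the environment

            # temperature=0 removes most of the sampling randomness; it does not add factual grounding.
            resp = client.chat.completions.create(
                model="gpt-4o-mini",  # illustrative model name
                messages=[{"role": "user", "content": "How far is the Andromeda Galaxy from Caldwell 70?"}],
                temperature=0,
            )
            print(resp.choices[0].message.content)
            ```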

            • btaf45@lemmy.world · 7 months ago (edited)

              LMFAO. This time the AI doesn’t even know that 6.12 > 2.537. It doesn’t even know to use the computer to compute things, even though it literally runs inside a computer.

              Me: is andromeda or ngc300 closer to us?

              ChatGPT: Between the Andromeda Galaxy (M31) and NGC 300, NGC 300 is closer to us. NGC 300 is a spiral galaxy located in the constellation Sculptor, and it’s approximately 6.12 million light-years away from Earth. In contrast, the Andromeda Galaxy (M31) is much farther, at around 2.537 million light-years away. Therefore, NGC 300 is closer to us than the Andromeda Galaxy.
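
              The comparison itself is trivial to get right outside the model; a throwaway check with the very distances quoted in that reply shows the conclusion is backwards:

              ```python
              # Distances in millions of light-years, exactly as quoted in the chatbot's reply.
              ngc300_mly = 6.12
              andromeda_mly = 2.537

              closer = "NGC 300" if ngc300_mly < andromeda_mly else "the Andromeda Galaxy (M31)"
              print(f"{closer} is closer: {andromeda_mly} Mly (Andromeda) vs {ngc300_mly} Mly (NGC 300)")
              # -> the Andromeda Galaxy (M31) is closer, the opposite of what the chatbot said.
              ```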

    • ilinamorato@lemmy.world · 7 months ago

      Google wants that to work. That’s why the “knowledge panels” kept popping up at the top of search results until now, with links to Wikipedia. They only want to answer the easy questions: definitions, math problems, things they can give you the Wikipedia answer for, Yelp reviews, “Thai Food Near Me,” etc. They don’t want to answer the hard questions, presumably because it’s harder to sell ads against niche questions and topics, and “harder” means you have to get humans involved. Which is why they’re complaining now that users are asking questions that are “too hard for our poor widdle generative AI to handle :-(”. They don’t want us to ask hard questions.