• FaceDeer@kbin.social
    11 months ago

    It’s not repeating its training data verbatim because it can’t do that. It doesn’t have the training data stored away inside itself. If it did the big news wouldn’t be AI, it would be the insanely magical compression algorithm that’s been discovered that allows many terabytes of data to be compressed down into just a few gigabytes.

    • Hello Hotel@lemmy.world
      11 months ago

      Do you remember quotes in English as ASCII? /s

      Tokens are even denser than ASCII, similar to word “chunking.” My guess is it’s like lossy video compression, but for text: [Attacked] with [lasers] by [Death Eaters] upon [Margaret]; [has flowery language]; word [Margaret] [comes first] (theoretical example with 7 “tokens”).
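      A toy sketch of that “chunking” idea (not a real LLM tokenizer — the vocabulary and the greedy two-word matching are made up for illustration): common chunks map to single integer IDs, so the sentence costs far fewer symbols than its raw ASCII bytes.

      ```python
      # Hypothetical mini-vocabulary: multi-word phrases can be one "chunk".
      vocab = {"attacked": 0, "with": 1, "lasers": 2, "by": 3,
               "death eaters": 4, "upon": 5, "margaret": 6}

      def toy_tokenize(text, vocab):
          """Greedy longest-match chunking over the toy vocab above."""
          words = text.lower().split()
          tokens, i = [], 0
          while i < len(words):
              pair = " ".join(words[i:i + 2])  # try a two-word chunk first
              if pair in vocab:
                  tokens.append(vocab[pair]); i += 2
              elif words[i] in vocab:
                  tokens.append(vocab[words[i]]); i += 1
              else:
                  tokens.append(-1); i += 1  # unknown chunk
          return tokens

      text = "Attacked with lasers by death eaters upon Margaret"
      toks = toy_tokenize(text, vocab)
      print(len(text.encode("ascii")), "ASCII bytes ->", len(toks), "tokens")
      # 50 bytes become 7 token IDs: "death eaters" collapses into one chunk.
      ```

      Real tokenizers (BPE and friends) learn their chunks from data rather than a hand-written table, but the density win is the same idea.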

      It may actually have internalized a really good copy of that book, as it’s likely read it lots of times.