• kbin_space_program@kbin.run
    3 months ago

    Because modern AIs are, by their inherent design, not deterministic.

    Throwing progressively bigger models at the problem cannot fix that. We need an entirely new approach to AI for that.
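
    Rough sketch of where that non-determinism comes from in practice: most LLM decoding samples the next token from a probability distribution rather than always taking the single most likely one. The logits and temperature below are invented purely for illustration; a real model produces fresh logits at every step.

    ```python
    import numpy as np

    # Toy illustration of the sampling step used in most LLM decoding.
    # These logits are made up; a real model emits them for each next token.
    logits = np.array([2.0, 1.5, 0.3, -1.0])
    temperature = 0.8

    # Softmax with temperature turns the logits into a probability distribution.
    probs = np.exp(logits / temperature)
    probs /= probs.sum()

    rng = np.random.default_rng()
    # Each draw can land on a different token, so repeated runs of the same
    # prompt can diverge -- the non-determinism referred to above.
    for _ in range(5):
        print(rng.choice(len(probs), p=probs))
    ```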

    • Immersive_Matthew@sh.itjust.works
      3 months ago

      Bigger models do start to show more emergent, intelligence-like properties, and components are being added to LLMs to make them more logical and robust. At least, this is what OpenAI and others are saying about even bigger datasets.

    • Onno (VK6FLAB)@lemmy.radio
      3 months ago

      For me, the biggest indicator that we’re barking up the wrong tree is energy consumption.

      Compare the energy required to feed a human with that required to train and run the current “leading edge” systems.
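
      Back-of-the-envelope numbers: a human runs on roughly 2,000 kcal a day, and a commonly cited estimate puts GPT-3’s training alone at around 1,300 MWh. Both figures are very approximate, but the gap is the point.

      ```python
      # Rough comparison; both inputs are approximate estimates.
      human_kwh_per_day = 2000 * 4184 / 3.6e6   # 2,000 kcal/day ≈ 2.3 kWh/day
      training_kwh = 1.3e6                       # ~1,300 MWh, a commonly cited GPT-3 training estimate

      human_days = training_kwh / human_kwh_per_day
      print(f"≈ {human_days:,.0f} human-days, or ≈ {human_days / 365:,.0f} human-years of food energy")
      ```

      And that is just training, before any inference.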

      From a software development perspective, I think machine learning is a very useful way to model unknown variables, but that’s not the same as “intelligence”.
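
      A toy example of what I mean by “modelling unknown variables”: we only ever see noisy samples of some hidden relationship, and the model fits an approximation of it. The data below is made up for illustration.

      ```python
      import numpy as np

      rng = np.random.default_rng(0)
      x = rng.uniform(0, 10, 200)
      hidden = 3.0 * x + 7.0                 # the "unknown" relationship we pretend not to know
      y = hidden + rng.normal(0, 2, x.size)  # all we observe are noisy samples

      # A least-squares fit recovers an estimate of the hidden parameters.
      slope, intercept = np.polyfit(x, y, 1)
      print(f"estimated slope={slope:.2f}, intercept={intercept:.2f}")
      ```

      Useful, but it is curve fitting, not “intelligence”.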