AI’s voracious need for computing power is threatening to overwhelm energy sources, requiring the industry to change its approach to the technology, according to Arm Holdings Plc Chief Executive Officer Rene Haas.

  • FaceDeer@fedia.io · 7 months ago

    No, it makes no sense. India has over a billion people. There’s no way that amount of computing power could just magically have poofed into existence over the past few years, nor the power plants necessary to run all of that.

    • The Octonaut · 7 months ago

      This is a future prediction, not a current observation.

      I’m not saying it’s correct as a prediction, but “where are the extra power plants” is not a good counter-argument.

      • FaceDeer@fedia.io · 7 months ago

        A couple of months ago the average temperature where I live was well below freezing. Now it’s around twenty degrees C.

        By this time next year it’ll be thousands of degrees!
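
        A toy version of the same extrapolation, with made-up numbers:

        ```python
        # Toy illustration of the extrapolation fallacy: fit a short-run trend,
        # then project it far past where it could possibly hold.
        # All numbers here are invented for the joke.

        temps_c = [-10, 5, 20]   # "well below freezing" two months ago -> ~20 C now

        # Linear extrapolation: +15 C per month for another 12 months
        linear = temps_c[-1] + 15 * 12
        print(f"Linear projection, one year out: {linear} C")   # 200 C

        # Compounding extrapolation (the rise doubles each month) passes
        # "thousands of degrees" within the year; the absurd number comes
        # from the model, not from the measurements.
        temp, months = 20, 0
        while temp < 1000:
            temp, months = temp * 2, months + 1
        print(f"Doubling projection passes 1000 C after {months} months")   # 6 months
        ```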

    • BakerBagel@midwest.social · 7 months ago

      The current LLMs kinda suck, but companies have fired huge swaths of their staff and plan on putting LLMs in their place. Either those companies hire back all those workers, or they get the programs to not suck. And making LLMs actually capable of working unsupervised will take more and more energy.

      • kakes@sh.itjust.works · 7 months ago

        My take is that LLMs are absolutely incredible… for personal use and hobby projects. I can’t think of a single task I would trust an LLM to perform entirely unsupervised in a business context.

        Of course, that’s just where LLMs are at today, though. They’ll improve.

      • MakePorkGreatAgain@lemmy.basedcount.com · 7 months ago

        LLMs will probably improve exponentially, similar to how CPUs did in the 80s/90s. In ~10 generations LLMs will likely be very useful.
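
        A back-of-envelope on what “exponential over ~10 generations” would mean, assuming a doubling per generation (an assumption borrowed from the CPU comparison, not a measured LLM trend):

        ```python
        # Toy compounding model: assume capability (or efficiency) doubles each
        # generation. The 2x factor is an assumption for illustration only.

        factor_per_generation = 2
        generations = 10
        print(factor_per_generation ** generations)   # 1024x after 10 generations
        ```

        Whether energy per unit of capability falls at anything like that rate is the open question the article is raising.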

      • FaceDeer@fedia.io · 7 months ago

        Sure, but it’s simply not physically possible for AI to be consuming that much power. Not enough computers exist, and not enough ability to manufacture new ones fast enough. There hasn’t been a giant surge of new power plants built in just the past few years, so if something was suddenly drawing an India’s worth of power then somewhere an India’s worth of consumers just went dark.

        This just isn’t plausible.
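
        For scale, a rough back-of-envelope; the consumption figures are approximate public estimates, and the AI share of data-centre demand is an outright assumption:

        ```python
        # Order-of-magnitude comparison of today's data-centre electricity use
        # with India's total consumption. All figures are rough estimates.

        INDIA_TWH_PER_YEAR = 1700     # India's total electricity use, approximately
        DATACENTRES_TWH_2022 = 460    # widely cited estimate for all data centres worldwide
        AI_SHARE = 0.15               # assumed fraction of that driven by AI workloads

        ai_now = DATACENTRES_TWH_2022 * AI_SHARE
        print(f"AI today (assumed): ~{ai_now:.0f} TWh/yr")
        print(f"India:              ~{INDIA_TWH_PER_YEAR} TWh/yr")
        print(f"Gap:                ~{INDIA_TWH_PER_YEAR / ai_now:.0f}x")
        # Roughly a 25x gap: "an India's worth of power" only works as a
        # projection of future growth, not a description of what AI draws today.
        ```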

    • Revv@lemmy.blahaj.zone · 7 months ago

      If only there had been another widespread, wasteful prior use of expensive and power hungry compute equipment that suddenly became less valuable/effective and could quickly be repurposed to run LLMs…

      • FaceDeer@fedia.io · 7 months ago

        Pretty sure the big AI corps aren’t depending on obsolete second-hand half-burned-out Ethereum mining rigs for their AI training.
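
        The memory math backs that up. A rough sketch using standard rules of thumb as assumptions (a hypothetical 70B-parameter model, ~16 bytes per parameter of mixed-precision training state, ~12 GB of VRAM on a mining-era card):

        ```python
        # Why retired mining GPUs don't help with LLM training: a rough memory
        # back-of-envelope. All figures are rules of thumb treated as assumptions.

        params = 70e9                # hypothetical 70B-parameter model
        bytes_per_param = 16         # fp16 weights + grads + fp32 master copy + optimizer state
        mining_card_vram_gb = 12     # typical VRAM on an Ethereum-era consumer GPU

        training_state_gb = params * bytes_per_param / 1e9
        cards_needed = training_state_gb / mining_card_vram_gb

        print(f"Training state: ~{training_state_gb:.0f} GB")        # ~1120 GB
        print(f"Mining cards just to hold it: ~{cards_needed:.0f}")  # ~93
        # And that ignores activations, plus the fast interconnect (NVLink or
        # InfiniBand) that multi-GPU training depends on and mining rigs never had.
        ```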