• Blue_Morpho@lemmy.world · 13 hours ago

    More efficient hardware use should be amazing for AI since it allows you to scale even further.

    If you can achieve scaling with software, you can delay current plans for expensive hardware. If a new driver came out that gave Nvidia 5090 performance to games on GTX 1080-equivalent hardware, would you still buy a new video card this year?
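    A rough back-of-envelope sketch of that deferral logic (every number here is made up for illustration):

```python
# All numbers hypothetical: a software update that doubles effective
# throughput can cover projected demand without buying new hardware.
current_capacity = 1_000   # arbitrary units of compute
demand_next_year = 1_500   # projected demand, same units
software_speedup = 2.0     # e.g. a driver update doubling throughput
cost_per_unit = 1_000_000  # dollars per unit of new compute

gap_without = max(0.0, demand_next_year - current_capacity)
gap_with = max(0.0, demand_next_year - current_capacity * software_speedup)

deferred = (gap_without - gap_with) * cost_per_unit
print(f"Hardware spend deferred by the speedup: ${deferred:,.0f}")
```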

    When all the telcos scaled back on building fiber in 2000, was that because they didn’t have a positive outlook for the Internet?

    Or when video game companies went bankrupt in the 1980s, was it because video games were over as entertainment?

    There’s a huge leap between not spending billions on new data centers (which are used for more than just AI) and claiming that cutback means AI is over.

    • FooBarrington@lemmy.world · 6 hours ago

      If a new driver came out that gave Nvidia 5090 performance to games on GTX 1080-equivalent hardware, would you still buy a new video card this year?

      It doesn’t make sense to compare games and AI. Games have a well-defined upper bound for performance; even Crysis has “maximum settings” that you can’t go above. Supposedly, this doesn’t hold true for AI: scaling it up should continually improve it.
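      That claim usually rests on empirical scaling laws, where loss falls as a power law in compute. A toy illustration (the constants below are invented, not fitted values):

```python
# Toy power-law scaling curve: loss ~ a * compute^(-alpha).
# 'a' and 'alpha' are illustrative constants, not fitted values.
a, alpha = 10.0, 0.05

def loss(compute: float) -> float:
    return a * compute ** -alpha

for doublings in range(5):
    c = 1e21 * 2**doublings  # FLOPs, arbitrary starting point
    print(f"compute {c:.1e} FLOPs -> loss {loss(c):.3f}")
# Each doubling still lowers the loss, just by less each time --
# unlike a game, there is no 'maximum settings' ceiling.
```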

      So: yes, in your analogy, MS would still buy a new video card this year if it believed that progress was possible and reasonably likely.

      • Blue_Morpho@lemmy.world · 37 seconds ago

        Just as games see diminishing returns on better graphics (they’re already photorealistic; few pay $2k for a GPU to render a few more hairs), AI has a plateau where it gives good-enough answers that people will pay for the service.

        If people are already paying you money and the general consumer doesn’t appreciate the next level of performance, why spend billions that will take longer to recoup?
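        A simple payback-period sketch of that tradeoff (figures are hypothetical):

```python
# Hypothetical payback comparison: extra capex only makes sense if the
# market actually pays for the extra capability it buys.
capex = 10e9                  # dollars on new data centers, made up
extra_revenue_per_year = 1e9  # what 'next level' performance adds, made up

payback_years = capex / extra_revenue_per_year
print(f"Payback period: {payback_years:.0f} years")
# If consumers won't pay more for the improvement, extra_revenue_per_year
# shrinks and the payback horizon stretches past what investors tolerate.
```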

        And again, data centers aren’t just used for AI.

    • Takumidesh@lemmy.world · 12 hours ago

      If buying a new video card made me money, yes.

      This doesn’t really work, because the goal when you buy a video card isn’t to have the most processing power possible, and video games don’t scale linearly, so adding another card gains you nothing.

      If I were mining crypto or selling GPU compute (which is basically what AI companies are doing) and the existing card got an update that made it perform on par with new cards, I would buy out the existing cards, and when there were no more, I would buy up the newer cards; both are still generating revenue.
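      A quick sketch of why both purchases still make sense (prices and rates are made up):

```python
# Hypothetical: if a software update makes old cards match new ones,
# both still earn; the cheaper card just pays for itself faster.
rental_rate = 1.50       # $/hour a card earns selling compute, made up
hours_per_year = 24 * 365

old_card_price = 400.0   # post-update street price, hypothetical
new_card_price = 2000.0  # hypothetical

yearly = rental_rate * hours_per_year
for name, price in [("old card", old_card_price), ("new card", new_card_price)]:
    print(f"{name}: ${yearly:,.0f}/yr -> payback in {price / yearly:.2f} years")
# So you buy out the cheap old stock first, then keep buying new cards.
```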

      • Blue_Morpho@lemmy.world · edited · 10 hours ago

        If buying a new video card made me money, yes

        But the supposition is that not buying a video card makes you the same money. You’re forecasting free performance upgrades, so there’s no need to spend money now when you can wait and upgrade the hardware once software improvements stop.
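        A toy model of that wait-and-see logic (growth rates invented for illustration):

```python
# Hypothetical: software gains stretch existing capacity, pushing out
# the date when you actually have to buy new hardware.
capacity = 1_000      # current effective capacity, arbitrary units
demand = 900          # current demand, same units
demand_growth = 0.25  # made-up yearly demand growth
software_gain = 0.30  # made-up yearly efficiency gain

year = 0
while demand <= capacity and year < 10:
    year += 1
    demand *= 1 + demand_growth
    capacity *= 1 + software_gain

if demand > capacity:
    print(f"New hardware first needed in year {year}")
else:
    print("No new hardware needed within 10 years")
```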

        And that’s assuming it has anything to do with AI at all, rather than the long-term macroeconomics of Trump destroying the economy, with MS putting off spending because businesses will be slowing down due to the tariff war.