• MalReynolds@piefed.social · 62 points · 2 days ago

    Didn’t OpenAI just sign a deal with Samsung and SK Hynix to consume 40% of the world’s DRAM wafer supply for the foreseeable future?

    This is a natural consequence of sucking up all the oxygen.

    Fucking AI bubble…

      • Luffy@lemmy.ml · 23 points · 2 days ago

        With enterprise-grade LLM hardware.

        Now, unless you intend on soldering 128 gigs of DRAM onto your RTX 2080, it’s not gonna be a big advantage to you.

        • SmoochyPit@lemmy.ca · 12 points · 2 days ago

          Minecraft with a 512-chunk render distance or 2048px resolution textures is gonna be fire tho
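
          For a rough sense of why that eats VRAM, here’s a back-of-envelope sketch in Python (the texture count, RGBA8 format, and mipmap overhead are my own assumptions, not anything measured from the game):

              # Back-of-envelope VRAM math; all assumptions hypothetical:
              #   - uncompressed RGBA8 textures (4 bytes per pixel)
              #   - a full mip chain (~1.33x overhead)
              #   - ~800 distinct textures in a resource pack
              BYTES_PER_PIXEL = 4
              MIP_OVERHEAD = 4 / 3
              NUM_TEXTURES = 800

              def texture_vram_gib(resolution: int) -> float:
                  """VRAM for NUM_TEXTURES square textures at a given edge size."""
                  total = NUM_TEXTURES * resolution**2 * BYTES_PER_PIXEL * MIP_OVERHEAD
                  return total / 2**30

              for res in (16, 256, 2048):
                  print(f"{res:>5}px pack: ~{texture_vram_gib(res):.3f} GiB of textures")

              # Render distance scales quadratically too: a 512-chunk radius keeps
              # (2*512 + 1)**2 chunk columns resident, vs (2*16 + 1)**2 at 16.
              print(f"chunks at r=512: {(2*512 + 1)**2:,} vs r=16: {(2*16 + 1)**2:,}")

          Under those assumptions a 2048px pack alone lands around 16.7 GiB of raw texture data, which is exactly the kind of thing all that extra DRAM would soak up.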

  • Washedupcynic@lemmy.ca · 14 points · 2 days ago

    I had a knee-jerk reaction to the headline, so I want to clarify for the TL;DR people: they’re talking about stopping manufacturing of lower- and mid-range graphics cards. They aren’t going to roll out an update that bricks cards already sold to customers (that was my knee-jerk assumption, because manufacturers have done shit like that before and I assume the worst).

  • humanspiral@lemmy.ca · 11 points · 2 days ago

    This is extremely serious for the economic bubble.

    Orders for datacenter AI chips already exceed supply, and packing more high-end memory onto each TSMC wafer makes the squeeze worse. This likely means higher prices per token for datacenter buyers, higher prices for users and model renters, and much slower demand growth and AI progress. It also means long delays for datacenters and a bigger black market (China, plus higher-than-MSRP diversions from contracted deliveries).

    I’m not sure if it affects phone/LPDDR soldered memory, but TSMC is going to charge more for phone chips too. This could cause the whole consumer/business computing market to collapse. A return of older-generation designs on underused process nodes would give little reason to upgrade while still overcharging. It could also be an opening for Chinese exports of competing products that weren’t viable at low/reasonable RAM and TSMC prices and availability: even if China has trouble matching the best yields, it’s still profitable to invest and expand aggressively, which discourages the US and its western colonies from investing.

    This race to give the US Skynet, for stronger political control/social credit/surveillance of Americans, can inflate a bubble in everything else and accelerate a financial collapse, all while making the goal impossible to achieve and forcing China to become stronger and more resilient, with a greater share of global computing supply.

    • Frezik@lemmy.blahaj.zone · 10 points · 2 days ago

      It affects all DRAM across the board. The fundamentals of DRAM haven’t changed in decades, and virtually all of it comes from three companies: Samsung, SK Hynix, and Micron.

      Good thing Microsoft forced people to throw away a bunch of perfectly functional PCs. This was the perfect time for everyone to have to buy new ones.

      • muusemuuse@sh.itjust.works · 2 points · 2 days ago

        They didn’t force anyone to throw away PCs. People have alternative options; they chose to be angry instead of doing anything about it.

        It’s the same shit people ripped on Apple for doing, but it’s somehow adorable when Microsoft does it.

      • Washedupcynic@lemmy.ca · 1 point · 1 day ago

        Linux has entered the chat. I installed Linux on my older machine, which is ~7 years old. My new machine has Windows 11 and I fucking hate it.

    • jaykrown@lemmy.world · 4 points · 1 day ago

      Absolutely. Something you missed, though: at the same time, AI models are becoming more efficient and cheaper to run, so these data centers are going to be a massive waste of resources in a year.

      • humanspiral@lemmy.ca · 1 point · edited · 19 hours ago

        If TSMC only makes datacenter chips from now on, then “we” are shut out from the huge privacy (and fine-tuning/specialization) gains offered by small, efficient, cheap-to-run models, and from playing games on new hardware. US datacenters will serve the US empire/political establishment, both with the government as the main LLM customer and through data collection/Palantir ontology/social credit scores on every American.

        I suspect that better datacenter chips won’t actually get cheaper, due to supply limitations, but even for small efficient models, personal hardware has a long payback period compared to a per-token “rental” charge in the cloud. It’s unlikely that all of the datacenter chip buyers will find non-government customers to use them all, so expect either a bailout, or bankruptcy followed by megatech buying the datacenters for cheap, followed by a bailout in the form of government revenue for big tech’s global/citizen-control applications.
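
        To make the payback-period point concrete, here’s a toy calculation in Python (every number is a placeholder I picked for illustration, not a real quote):

            # Toy "own vs. rent" payback sketch; all figures hypothetical.
            HARDWARE_COST = 2500.0         # local inference build, assumed
            POWER_PER_MONTH = 15.0         # electricity cost, assumed
            PRICE_PER_MTOK = 0.50          # $ per million tokens rented, assumed
            TOKENS_PER_MONTH = 20_000_000  # a heavy individual user, assumed

            cloud_bill = TOKENS_PER_MONTH / 1_000_000 * PRICE_PER_MTOK
            savings = cloud_bill - POWER_PER_MONTH  # what owning avoids paying

            if savings <= 0:
                print(f"cloud is ${cloud_bill:.2f}/mo; local never pays off")
            else:
                print(f"payback in {HARDWARE_COST / savings:.0f} months")

        With those made-up numbers the cloud bill is $10 a month, so the local box never pays for itself: that’s the long-payback problem in one line, and it only flips if cloud per-token prices rise or your token volume is huge.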

        Eventually, at the planned expansion pace, even the government will have more AI resources than it can use, and then the consumer/business computing/GPU market comes back. A collective understanding of the absurdity could arrive as soon as 2026.

    • DdCno1@beehaw.org · 15 points · 2 days ago

      What emerging players? You can’t just whip up a competitive GPU in a jiffy, even if you have Intel money.

      Also, unless they are from a different planet that has its own independent supply chain, they’d have to deal with the very same memory shortage and the very same foundries that are booked out for years.

      • humanspiral@lemmy.ca · 5 points · 2 days ago

        Chinese emerging players, who are shut out from TSMC/Taiwan/ROK chips and memory sources.

      • luciole (he/him)@beehaw.org · 2 points · 1 day ago

        If the big two completely abandon the low-to-mid market, Intel, Lisuan, Moore Threads, or whoever might put the little DRAM they manage to grab into that orphaned market. A large proportion of gamers aren’t into buying $2000 GPUs. Those companies might not succeed right away, but they’ve been cooking for a while already, so I wonder.

        • tempest@lemmy.ca · 2 points · 21 hours ago

          The price after the RAM increase would bump those cards upmarket and out of reach of the people in that segment.

        • Frezik@lemmy.blahaj.zone · 2 points · 23 hours ago

          To sell at a loss, or at least at very low profit? Low-end GPUs tend to have tight margins to begin with. Why put limited DRAM in them when there are products that need it and can actually be sold at a profit?

          I guess they could be a loss leader. That’s not a sustainable business model, though, and this DRAM shortage is projected to last a while.

          • luciole (he/him)@beehaw.org · 2 points · 22 hours ago

            I agree. In retrospect, when I said “big opportunity” I was pushing it. It’s more of a (narrow) potential opening to try for a modest market share. I guess I’m just hoping affordable GPUs remain a thing.