• jacksilver@lemmy.world · 32 upvotes · 5 days ago

    This is something I’ve been speculating about for a while. The cost of running these complex systems (OpenAI’s models aren’t just LLMs) is so heavily subsidized that we don’t really know what they actually cost to run.

    This is a huge risk for any business building on them, as prices for these services will have to go up significantly in the long term.

    • David Gerard@awful.systemsM · 13 upvotes · 4 days ago

      Ed Zitron calculated from the publicly available numbers that OpenAI was spending $2.35 for every $1 of ChatGPT it sells.
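
      A rough back-of-envelope check of that ratio, as a small Python sketch. The revenue and cost figures below are the widely reported 2024 numbers the estimate appears to be based on; they are an assumption here, not something stated in this thread.

      ```python
      # Sketch of the "$2.35 spent per $1 earned" arithmetic.
      # Assumed inputs: ~$3.7B reported 2024 revenue and ~$8.7B total spend
      # (revenue plus a ~$5B loss) -- not figures taken from this thread.
      revenue_usd_bn = 3.7
      costs_usd_bn = 8.7

      spend_per_dollar = costs_usd_bn / revenue_usd_bn
      print(f"${spend_per_dollar:.2f} spent per $1 of ChatGPT revenue")  # ~$2.35
      ```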

      • jacksilver@lemmy.world · 4 upvotes / 1 downvote · 4 days ago

        Is that for all operations, or literally just to run the paid services? Because if that includes the free services, marketing, and R&D, then they have a lot of options to cut costs.

        Given what AWS/etc. charge for their LLMs/APIs, it feels like the entire industry is subsidizing LLM compute to stay competitive. But I could be wrong there.

    • baldingpudenda@lemmy.world · 9 upvotes · 4 days ago

      Was it Altman who tweeted that they were near the singularity? I assumed it was a way to raise money. Felt more like “Fuck! We need more money to burn.”

      • froztbyte@awful.systems · 9 upvotes · 4 days ago

        they were only “near AGI” before their most recent funding rounds closed; after that they were “a few thousand days” away