• Alphane Moon

    I really hope this is the beginning of a massive correction on AI hype.

      • GreatAlbatross

        Or from the sounds of it, doing things more efficiently.
        Fewer cycles required, less hardware required.

        Maybe this was an inevitability: if you cut off access to the fast hardware, you create a natural advantage for more efficient systems.

        • @sugar_in_your_tea@sh.itjust.works

          That’s generally how tech goes though. You throw hardware at the problem until it works, and then you optimize it to run on laptops and eventually phones. Usually hardware improvements and software optimizations meet somewhere in the middle.

          Look at photo and video editing, you used to need a workstation for that, and now you can get most of it on your phone. Surely AI is destined to follow the same path, with local models getting more and more robust until eventually the beefy cloud services are no longer required.
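
          As a rough illustration of where that local path already is: here’s a minimal Python sketch of running a small open-weights model entirely on your own hardware. It assumes the Hugging Face transformers library (plus PyTorch) is installed, and the model name is just an example stand-in, not something mentioned in this thread.

          # Minimal local text generation, no cloud service involved.
          # Assumes: pip install transformers torch, and a machine with a few GB of free RAM.
          from transformers import pipeline
          generator = pipeline(
              "text-generation",
              model="Qwen/Qwen2.5-0.5B-Instruct",  # example small model; any local model works
          )
          result = generator(
              "Summarize why local AI inference keeps getting cheaper.",
              max_new_tokens=80,
          )
          print(result[0]["generated_text"])

          Once the weights are downloaded, nothing in that snippet needs a data center.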

          • @jmcs@discuss.tchncs.de

            The problem for American tech companies is that they didn’t even try to move to stage 2.

            OpenAI is hemorrhaging money even on their most expensive subscription, and their entire business plan was to hemorrhage money even faster, to the point they would use entire power stations to power their data centers. Their plan makes about as much sense as digging yourself out of a hole by trying to dig to the other side of the globe.

            • @sugar_in_your_tea@sh.itjust.works

              Hey, my friends and I would’ve made it to China if recess was a bit longer.

              Seriously though, the goal for something like OpenAI shouldn’t be to sell products to end customers, but to license models to companies that sell “solutions.” I see these direct to consumer devices similarly to how GPU manufacturers see reference cards or how Valve sees the Steam Deck: they’re a proof of concept for others to follow.

              OpenAI should be looking to be more like ARM and less like Apple. If they do that, they might just grow into their valuation.

      • @theunknownmuncher@lemmy.world

        China really has nothing to do with it; it could have been anyone. It’s a reaction to realizing that GPT4-equivalent AI models are dramatically cheaper to train than previously thought (see the rough arithmetic sketched at the end of this comment).

        It being China is a notable detail because it really drives the nail into the coffin for NVIDIA, since China has been fenced off from access to NVIDIA’s most expensive AI GPUs, which were thought to be required to pull this off.

        It also makes the US government look extremely foolish for having made major foreign policy and relationship sacrifices to try to delay China by a few years. It’s only January and China has already caught up; those sacrifices did not pay off. In fact, they backfired: they have benefited China and will let it accelerate, while hurting US tech and AI companies.
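
        For a sense of why “dramatically cheaper” is plausible, here’s a very rough back-of-envelope sketch in Python. Every number in it is an assumption for illustration (the common 6 × parameters × tokens FLOP rule of thumb, a 70B-parameter dense model, 15T training tokens, roughly 1 PFLOP/s peak per GPU at 40% utilization, $2 per GPU-hour); none of it is a figure from any lab or article.

        # Back-of-envelope transformer training cost; all inputs are illustrative assumptions.
        params = 70e9                 # model parameters (assumed)
        tokens = 15e12                # training tokens (assumed)
        train_flops = 6 * params * tokens      # widely used approximation for dense training FLOPs
        gpu_peak_flops = 1.0e15       # ~1 PFLOP/s peak per accelerator (assumed)
        utilization = 0.40            # fraction of peak actually sustained (assumed)
        dollars_per_gpu_hour = 2.0    # assumed rental / amortized price
        gpu_hours = train_flops / (gpu_peak_flops * utilization) / 3600
        cost = gpu_hours * dollars_per_gpu_hour
        print(f"~{gpu_hours / 1e6:.1f}M GPU-hours, ~${cost / 1e6:.1f}M")  # ~4.4M GPU-hours, ~$8.8M

        The exact figure doesn’t matter; the point is that the bill is basically FLOPs divided by hardware efficiency, so any trick that cuts the FLOPs (or raises utilization) cuts the training cost almost one-for-one, which is what makes a sudden drop in training cost believable.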

      • @golli@lemm.ee

        It’s a reaction to thinking China has better AI

        I don’t think this is the primary reason behind Nvidia’s drop, because as long as they have a massive technological lead it doesn’t matter as much to them who has the best model, as long as those companies use their GPUs to train them.

        The real change is that the compute resources (which are Nvidia’s product) needed to create a great model suddenly fell off a cliff, whereas until now the name of the game was that more is better and scale is everything.

        China vs the West (or upstart vs big players) matters to those who are investing in creating those models. Take Meta, for example: they presumably spend a ton of money on highly paid engineers and data centers, and somehow got upstaged by someone with a fraction of their resources.

        • mapumbaa

          I really don’t believe the technological lead is massive.

          • @golli@lemm.ee

            Looking at the market cap of Nvidia vs their competitors, the market believes it is: they just lost more than AMD, Intel, and the like are worth combined, and are still valued at around $2.9 trillion.

            And by technology I mean both the performance of their hardware and the software stack they’ve created, which is a big part of their dominance.

            • mapumbaa

              Yeah. I don’t believe market value is a great indicator in this case. In general, I would say that capital markets are rational at a macro level, but not micro. This is all speculation/gambling.

              My guess is that AMD and Intel are at most 1 year behind Nvidia when it comes to tech stack. “China”, maybe 2 years, probably less.

              However, if you can make chips with 80% of the performance at 10% of the price, it’s a win. People can continue to tell themselves that big tech will always buy the latest and greatest whatever the cost. It does not make it true. I mean, it hasn’t been true for a really long time. Google, Meta and Amazon already make their own chips. That’s probably true for DeepSeek as well.
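
              Purely as an illustration of that arithmetic (both figures assumed, not measured):

              # 80% of the performance at 10% of the price (assumed figures):
              perf_vs_incumbent = 0.80
              price_vs_incumbent = 0.10
              perf_per_dollar_gain = perf_vs_incumbent / price_vs_incumbent
              print(f"{perf_per_dollar_gain:.0f}x the performance per dollar")  # 8x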

      • @nieceandtows@lemmy.world

        From what I understand, it’s more that it takes a lot less money to train your own LLM with the same capabilities using this one than to pay to license one of the expensive ones. Somebody correct me if I’m wrong.

        • @CheeseNoodle@lemmy.world

          I wouldn’t be surprised if China spent more on AI development than the West did. Sure, here we spent tens of billions while China only invested a few million, but that few million was actually spent on the development, while out of the tens of billions all but $5 was spent on bonuses and yachts.

      • bobalot

        Does it still need people spending huge amounts of time to train models?

        After doing neural networks, fuzzy logic, etc. in university, I really question the whole usability of what is called “AI” outside niche use cases.

      • @tburkhol@lemmy.world

        Exactly. Galaxy brains on Wall Street realizing that nvidia’s monopoly pricing power is coming to an end. This was inevitable - China has 4x as many workers as the US, trained in the best labs and best universities in the world, interns at the best companies, then, because of racism, sent back to China. Blocking sales of nvidia chips to China drives them to develop their own hardware, rather than getting them hooked on Western hardware. China’s AI may not be as efficient or as good as the West right now, but it will be cheaper, and it will get better.

    • @givesomefucks@lemmy.world

      It’s coming, Pelosi sold her shares like a month ago.

      It’s going to crash, if not for the reasons she sold for, then because as more and more people hear she sold, they’re going to sell too, assuming she has insider knowledge due to her office.

      Which is why politicians (and spouses) shouldn’t be able to directly invest into individual companies.

      Even if they aren’t doing anything wrong, people will follow them and do what they do. Only a truly ignorant person would believe it doesn’t have an effect on other people.

      • @ShinkanTrain@lemmy.ml

        It’s coming, Pelosi sold her shares like a month ago.

        Yeah but only cause she was really disappointed with the 5000 series lineup. Can you blame her for wanting real rasterization improvements?

        • @givesomefucks@lemmy.world

          Everyone’s disappointed with the 5000 series…

          They’re giving up on improving rasterization and focusing on “AI cores” because they’re using GPUs to pay for the research into AI.

          “Real” core count is going down on the 5000 series.

          It’s not what gamers want, but they’re counting on people just buying the newest before asking if newer is really better. It’s why they’re already cutting 4000 series production; they just won’t give people the option.

          I think everything under the 4070 Super is already discontinued.

        • @Trainguyrom@reddthat.com

          You joke, but there are a lot of grandma/grandpa gamers these days. Remember, someone who played PC games back in the 80s would be in their 50s or 60s now, or even older if they picked up the hobby as an adult in the 80s.

    • SuiXi3D

      I just hope it means I can get a high end GPU for less than a grand one day.

      • @NuXCOM_90Percent@lemmy.zip

        Prices rarely, if ever, go down and there is a push across the board to offload things “to the cloud” for a range of reasons.

        That said: If your focus is on gaming, AMD is REAL good these days and, if you can get past their completely nonsensical naming scheme, you can often get a really good GPU using “last year’s” technology for 500-800 USD (discounted to 400-600 or so).

      • @manicdave@feddit.uk

        I’m using an RX 6700 XT, which you can get for about £300, and it works fine.

        Edit: try using ollama on your PC. If your CPU is capable, that software should work out the rest.
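
        For anyone who wants to go one step further than the chat window, here’s a minimal Python sketch of talking to a local ollama server. It assumes ollama is running on its default port (11434) and that a small model has already been pulled; the model name is just an example.

        # Query a locally running ollama server; no cloud API involved.
        # Assumes something like `ollama pull llama3.2` has been run beforehand.
        import requests
        resp = requests.post(
            "http://localhost:11434/api/generate",
            json={
                "model": "llama3.2",          # example name; use whatever model you pulled
                "prompt": "Explain what an RX 6700 XT is in one sentence.",
                "stream": False,
            },
            timeout=120,
        )
        print(resp.json()["response"])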

    • @FooBarrington@lemmy.world

      If anything, this will accelerate the AI hype, as big leaps forward have been made without increased resource usage.

      • Alphane Moon

        Something has got to give. You can’t spend ~$200 billion annually on capex and get a mere $2-3 billion return on this investment.

        I understand that they are searching for a radical breakthrough “that will change everything”, but there are also reasons to be skeptical about this (e.g. documents revealing that Microsoft and OpenAI defined AGI as something that can get them $100 billion in annual revenue, as opposed to some specific capabilities).
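
        Putting the mismatch in plain numbers, using the rough figures above:

        # Rough figures from this comment: ~$200B annual capex vs ~$2-3B coming back.
        capex = 200e9
        annual_return = 2.5e9          # midpoint of the $2-3B range
        print(f"{annual_return / capex:.2%} return on the money spent")  # prints 1.25%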