How long until GPUs get replaced by APUs for everyone but the very autistic (see: sound cards)?

  1. 4 weeks ago
    Anonymous

    Sound is an easy workload. With graphics they're always pumping up the resolution and visual fidelity and introducing harder workloads, so who the fuck knows.
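
    To put rough numbers on that (back-of-the-envelope sketch; the rates are illustrative, and each pixel also takes far more math than an audio sample does):

      # Rough throughput comparison: audio vs. graphics work items per second.
      audio_rate = 48_000 * 2             # 48 kHz stereo samples per second
      pixels_1080p60 = 1920 * 1080 * 60   # pixels shaded per second at 1080p/60
      print(f"audio samples/s:   {audio_rate:,}")       # ~96 thousand
      print(f"pixels/s @1080p60: {pixels_1080p60:,}")   # ~124 million
      print(f"~{pixels_1080p60 // audio_rate}x more work items for graphics")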

    • 4 weeks ago
      Anonymous

      So turn down the settings.

      RETARD.

    • 4 weeks ago
      Anonymous

      >Sound is an easy workload
      wrong

  2. 4 weeks ago
    Anonymous

    not any time soon. Gamers love their eye candy, and graphics are embarrassingly parallel (sketch below), which means the speed you can get is more or less a function of the power you're willing to draw (and dissipate), and that's way, way more limited in an APU than in a dGPU. Oh yeah, and now we have AI: a non-gaming GPU workload that tons of ordinary people actually want to run. And those people care about making it fast, which bumps into the same power-dissipation problem.

    if you aren't doing either of those two things, you haven't needed a dGPU for like ten years now, unless you just need more outputs.
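
    A minimal sketch of the "embarrassingly parallel" point, in Python for illustration: every pixel can be shaded independently of every other pixel, so throughput scales with however many units (and watts) you can throw at it.

      from multiprocessing import Pool

      # Toy "shader": each pixel's value depends only on its own coordinates,
      # so the whole frame can be computed by any number of workers in parallel.
      def shade(pixel):
          x, y = pixel
          return (x * 31 + y * 17) % 256  # stand-in for real lighting math

      if __name__ == "__main__":
          W, H = 640, 360
          pixels = [(x, y) for y in range(H) for x in range(W)]
          with Pool() as pool:                  # more workers -> more throughput,
              frame = pool.map(shade, pixels)   # bounded mostly by power budget
          print(len(frame), "pixels shaded")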

    • 4 weeks ago
      Anonymous

      The majority of gamers use an APU, whether it's their laptop or a console.

      • 4 weeks ago
        Anonymous

        Trannies don't count

        • 4 weeks ago
          Anonymous

          Haven't you noticed PC gamers have an elevated rate of trannies? Consoles and laptops are the normies.

    • 4 weeks ago
      Anonymous

      We are already there

      Lots of gamers don't care about having the best graphics as long as the game will run. As for AI, most people do that online using other people's computers, so local performance doesn't matter there anyway.

      • 3 weeks ago
        Anonymous

        >Lots of gamers don't care about having the best graphics as long as the game will run.
        nope, unless you're doing le indie retro game or something for a very specific niche, graphics are pretty much always the main selling point, right after the IP the game is based on

        • 3 weeks ago
          Anonymous

          Seeing graphics in trailers activates monkey neurons and makes people want to play it, but 90% of people will end up playing on low/medium anyway

  3. 4 weeks ago
    Anonymous

    It doesn't solve a problem, so never.
    Market segmentation also makes it unlikely. For a simple example:

    If I want power, then a GPU brand.
    If I want form factor, a Chromebook.
    If I want streamlining, Apple.

    Apple can't do an APU because they're proprietary.
    A Chromebook can't do an APU because of, ironically, higher power and manufacturing cost requirements.
    A GPU laptop, well, it's a scam, but the value proposition is still worse because it sounds like integrated graphics.

    So never?

  4. 4 weeks ago
    pixDAIZ

    Probably never, because most cheap consumer motherboards are geared toward ~100-200 W of power delivery on the CPU socket, which an APU has to share between the CPU and GPU. Thus you can max out either CPU or GPU performance, but not both concurrently while gaming, so a bottleneck will always persist. When you look at console frametimes you'll notice that only half the games are able to consistently maintain 60 FPS as 1% lows.

    Half of LULZ kept creaming themselves speculating about some kind of alien technology from the future that would allow the PS5 APU to access hundreds and hundreds of watts of power delivery while gaming and thus let it compete with $1,000 PC builds at the time. In reality less than 200 watts of power consumption is observed, which explains picrel as the CPU and GPU fight over power.
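
    As arithmetic (the draw figures here are assumptions for illustration, not measurements):

      # Illustrative split of a shared ~200 W socket budget vs. a dGPU setup.
      apu_budget = 200              # W for CPU + GPU combined (per the post)
      cpu_draw_under_load = 80      # assumed CPU draw in a demanding scene
      print(f"APU: GPU left with ~{apu_budget - cpu_draw_under_load} W when the CPU spikes")

      dgpu_power_limit = 220        # assumed midrange card's own power limit
      print(f"dGPU: keeps ~{dgpu_power_limit} W regardless of CPU load")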

  6. 4 weeks ago
    Anonymous

    I think professionals will be using discrete GPUs for a while.

  7. 4 weeks ago
    Anonymous

    Whenever cloud/streaming becomes the "standard" way games are delivered.
    It'll happen at some point.

  8. 4 weeks ago
    Anonymous

    they won't turn into sound cards. The reason is that sound cards got as good as they ever need to be 20-25 years ago; there's just no way to make them sound better, and nothing has done effects or mixing on the sound card in 15-20 years either. They're just a glorified buffer + DAC + amp like in the old days, only now even the cheapest parts do transparent quality no problem.
    there's simply no room to make better sound cards; they're finished

    graphics cards, on the other hand: there's no end in sight to making those better, and where there's a way to make one better, there will be better options
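
    The "transparent quality" bit checks out on paper: an ideal N-bit DAC has a quantization SNR of about 6.02N + 1.76 dB, and 16-bit already exceeds what any normal room or ear can use.

      # Theoretical SNR of an ideal N-bit DAC: ~6.02*N + 1.76 dB.
      for bits in (16, 24):
          print(f"{bits}-bit: ~{6.02 * bits + 1.76:.0f} dB")  # 16-bit: ~98 dB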

    • 4 weeks ago
      Anonymous

      this, and software always rises up to meet the power of a new hardware feature set. ray tracing is only in its infancy, for example.

    • 4 weeks ago
      Anonymous

      There's a limit to how realistic something can look; some stuff done in UE5 looks like a real-life recording

      • 3 weeks ago
        Anonymous

        that's just one aspect. there are many other ways to expand a game's use of the GPU: larger draw distance, more densely placed elements in a scene, higher quality shadows and lighting, etc. every front can be improved upon.
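
        Draw distance in particular scales brutally: visible ground area (and roughly the object count) grows with the square of the view distance. Rough sketch, distances made up:

          # Visible area ~ distance^2, so "just see further" eats GPU headroom fast.
          for dist in (500, 1000, 2000):          # meters, illustrative
              rel = (dist / 500) ** 2
              print(f"draw distance {dist} m -> ~{rel:.0f}x the scene to render")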

      • 3 weeks ago
        Anonymous

        then you want to run that realistic display at realistic framerates (above 24 or whatever consoles target) and with a bigger draw distance, then you want things to actually happen in it on top of all of the above instead of just a scripted flyover
        that + all the AI shit (which is only going to get better and therefore more demanding) will make sure the GPU will always be a dedicated component

  9. 4 weeks ago
    Anonymous

    I've been waiting on AI blocks on CPU chips for a long while, sorta like the video encoders/decoders that Intel has had for a while.

    Small-scale AI inference for general consumer use is extremely viable if utilized to generate dynamic AI bots/TTS/speech recognition/etc.
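
    For scale, a minimal inference sketch (random stand-in weights, assuming numpy is available): small models like this run comfortably on a CPU, which is all those use cases need.

      import numpy as np

      # Toy two-layer network with random stand-in weights, not a real model.
      rng = np.random.default_rng(0)
      W1 = rng.standard_normal((256, 512))
      W2 = rng.standard_normal((512, 64))

      def infer(x):
          h = np.maximum(x @ W1, 0.0)   # ReLU hidden layer
          return h @ W2                 # output logits

      x = rng.standard_normal((1, 256))
      print(infer(x).shape)             # (1, 64) -- cheap enough without a dGPU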

  10. 4 weeks ago
    Anonymous

    We've already been there since around the time Skylake came out.
    Unless you're doing GPU-related work or are a GaMeR, you really don't need a dedicated card.

  11. 4 weeks ago
    Anonymous

    Not until we can do 100% fluid physics.

  12. 4 weeks ago
    Anonymous

    My 5600G runs all my games fine on its own

  13. 4 weeks ago
    Anonymous

    on laptops? it's already happening imo.
    on desktop, never

  14. 4 weeks ago
    Anonymous

    >1080p only
    within a few years

  15. 4 weeks ago
    Anonymous

    Nvidia would never allow that.

  16. 4 weeks ago
    Anonymous

    If they can make an APU that gives 240 fps on a 4K monitor, then I'm good with them selling APUs and not selling dedicated GPUs.

    • 4 weeks ago
      Anonymous

      99% of people don't need 240 fps at 4K. If APUs can do 1080p at 30 or 60 fps (i.e. a console), then 90% of GPUs are kill.
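
      The gap between those two targets is bigger than it sounds; rough pixel-throughput math:

        # ~16x more pixels per second at 4K/240 than at 1080p/60.
        p1080_60 = 1920 * 1080 * 60
        p4k_240 = 3840 * 2160 * 240
        print(f"{p4k_240 / p1080_60:.0f}x")   # prints 16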

      • 4 weeks ago
        Anonymous

        then 144 Hz becomes standard, then what?

        • 4 weeks ago
          Anonymous

          Continue playing at 60fps. You can always just add a GPU but most people still wouldn't bother.

  17. 4 weeks ago
    Anonymous

    APUs are worth nothing without unified memory and a very fast CPU interconnect.

  18. 4 weeks ago
    Anonymous

    Forever, due to gaming, and to a lesser extent machine learning. I say lesser because there are a lot of ML applications that can run just on your CPU, and plenty of it is done on web services rather than locally, but very few games can run without a discrete GPU, and cloud gaming sucks.

    • 3 weeks ago
      Anonymous

      Gaming has stagnated for the last decade. I can still play new games well enough with a GTX 970. The only reason to upgrade now is for memes like 4K.

  19. 3 weeks ago
    Anonymous

    Sound cards effectively still exist.
    People just use USB DACs.

    • 3 weeks ago
      Anonymous

      I don't think it's about whether they exist, but about how much more niche they've become.

  20. 3 weeks ago
    Anonymous

    When gaming companies get their shit together and stop making games that require a high-end GPU to run while looking worse than shit that was released 10 years ago.
    So never.

  21. 3 weeks ago
    Anonymous

    AMD's APUs already made poverty-tier 1030-esque garbage obsolete and pointless. Making a 3050/3060 equivalent will need some drastic changes to the IO die, because the current design is bandwidth-starved to a comical degree. Either drop DDR altogether in favor of HBM or make a 6- or 8-channel memory controller tuned for frequency rather than latency, and then figure out how to package that abomination into a standard ATX form factor.
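
    The starvation in rough spec-sheet numbers (dual-channel DDR5-6000 feeding an APU vs. an RTX 3060's GDDR6; peak theoretical figures):

      # Peak theoretical memory bandwidth, from public specs.
      ddr5_dual = 2 * (64 / 8) * 6000e6 / 1e9    # 2 ch * 8 B * 6000 MT/s -> GB/s
      gddr6_3060 = (192 / 8) * 15e9 / 1e9        # 192-bit bus * 15 Gbps/pin -> GB/s
      print(f"DDR5-6000 dual channel: ~{ddr5_dual:.0f} GB/s")    # ~96
      print(f"RTX 3060 GDDR6:         ~{gddr6_3060:.0f} GB/s")   # ~360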

  22. 3 weeks ago
    Anonymous

    dGPUs will keep getting higher in price and will therefore be reserved for game-streaming servers and AI-training servers

    normies want low-noise, big-battery systems, so they will only buy APU-based systems anyway; you can already see it with the M1 MacBooks' success

  23. 3 weeks ago
    Anonymous

    never. If AMD wanted desktops with powerful integrated graphics, they would already exist

  24. 3 weeks ago
    Anonymous

    Never, unless you start soldering GDDR to the motherboard. Desktop APUs will always be crippled by a lack of memory bandwidth.

  25. 3 weeks ago
    Anonymous

    if APU only means an on-die dedicated GPU, probably never
    but general-purpose hardware is now fast enough, and "hardware acceleration" requirements common enough, that it doesn't really make much sense to develop separate accelerators anymore; the only obstacle is the cost of moving the industry to a sane unified scalar/vector architecture like the one being developed at libre-soc
    https://libre-soc.org/3d_gpu/architecture/
    https://libre-soc.org/openpower/sv/
    for now the big players are content with keeping things as they are because they don't want to risk losing the advantage of having put a lot of R&D into the current paradigm, but if one of them takes the risk and invests in true hybrid processors to gain a head start in their development, it's probably safe to say that dedicated GPUs would disappear completely

  26. 3 weeks ago
    Anonymous

    Tons of casual users use integrated graphics, so I imagine they'll be happy with an upgrade to APUs, but I can't imagine that anyone who actually utilizes a graphics card (gamers, people who render shit, people using AI) will be using APUs any time soon. You'd have to get to the point where APUs are at least as good as mid-level graphics cards, which they are nowhere near. Also, games keep getting more and more graphics-intensive. So yeah, I don't see them replacing GPUs anytime soon.

  27. 3 weeks ago
    Anonymous

    Never
    Memory is too much of a bottleneck
