Sound is an easy workload. With graphics they're always pumping up the resolution and the visual fidelity and introducing harder workloads, so who the fuck knows.
So turn down the settings.
RETARD.
>Sound is an easy workload
wrong
Not any time soon. Gamers love their eye candy, and graphics are embarrassingly parallel, which means the speed you can get is more or less a function of the power you're willing to draw (and dissipate), which is way more limited in an APU than in a dGPU. Oh yeah, and now we have AI: a non-gaming GPU workload that tons of ordinary people actually want to run. And those people care about making it fast, bumping into the same power-dissipation problem.
If you aren't doing either of those two things, you haven't needed a dGPU for like ten years now, unless you just need more display outputs.
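To make the power-scaling point concrete, here's a toy back-of-envelope in Python; the wattage and perf-per-watt figures are my own illustrative assumptions, not anything from the post:

# Embarrassingly parallel workload: throughput scales roughly with power
# at a fixed perf-per-watt. All numbers below are assumed for illustration.
def relative_perf(power_watts, perf_per_watt=1.0):
    return power_watts * perf_per_watt

apu_budget_w = 65    # assumed whole-package budget for a desktop APU
dgpu_budget_w = 320  # assumed board power for a high-end dGPU
print(relative_perf(dgpu_budget_w) / relative_perf(apu_budget_w))  # ~4.9x headroom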
The majority of gamers use an APU, whether it's in their laptop or console.
Trannies don't count
Haven't you noticed PC gaming has an elevated rate of trannies? Consoles and laptops are the normies.
We are already there
Lots of gamers don't care about having the best graphics as long as the game will run. As for AI, most people do that online using other people's computers, so local performance doesn't matter there anyway.
>Lots of gamers don't care about having the best graphics as long as the game will run.
Nope. Unless you're doing le indie retro game or something for a very specific niche, graphics are pretty much always the main selling point, right after the IP the game is based on.
Seeing graphics in trailers activates monkey neurons and makes people want to play it, but 90% of people will end up playing it on low/medium anyway.
It doesn't solve a problem, so never.
Market segmentation means not likely. For a simple example:
If I want power: a GPU brand.
If I want form factor: a Chromebook.
If I want streamlining: Apple.
Apple can't APU because their hardware is proprietary.
Chromebooks can't APU because, ironically, of the higher power and manufacturing cost requirements.
A GPU laptop, well, it's a scam, but the APU value proposition is still worse because it sounds like integrated graphics.
So never?
Probably never, because most cheap consumer motherboards are geared toward ~100-200 W of power delivery on the CPU socket, which would have to be shared between the CPU and GPU. Thus you can max out either CPU or GPU performance, but not both concurrently while gaming, so a bottleneck will always persist. When you look at console frametimes you'll notice that only half the games are able to consistently maintain 60 FPS in their 1% lows.
Half of LULZ kept creaming themselves speculating about some kind of alien technology from the future that would allow the PS5 APU to access hundreds and hundreds of watts of power delivery while gaming and thus let it compete with $1,000 PC builds at the time. In reality, less than 200 watts of power consumption is observed, which explains picrel: the CPU and GPU fight over power.
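A minimal sketch of the shared-budget problem, assuming the ~200 W socket figure from the post; the split values are arbitrary:

# Toy model: one socket budget split between CPU and GPU.
# Maxing out one side necessarily starves the other.
SOCKET_BUDGET_W = 200

def split_budget(cpu_share):
    cpu_w = SOCKET_BUDGET_W * cpu_share
    return cpu_w, SOCKET_BUDGET_W - cpu_w

for share in (0.2, 0.5, 0.8):
    cpu_w, gpu_w = split_budget(share)
    print(f"CPU {cpu_w:.0f} W / GPU {gpu_w:.0f} W")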
I think professionals will be using discrete GPUs for a while.
Whenever cloud/streaming becomes the "standard" way games are delivered.
It'll happen at some point.
They won't turn into sound cards. The reason is that sound cards got as good as they ever need to be 20-25 years ago; there's just no way to make them sound better. And nothing has done effects or mixing on sound cards for 15-20 years either. They're just a glorified buffer + DAC + amp like the old days, only now even the cheapest parts do transparent quality, no problem.
There's simply no room to make better sound cards; they're finished.
Graphics cards, on the other hand: there's no end in sight to making those better, and where there's a way to make one better, there will be better options.
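The "transparent quality" claim checks out with plain CD-era numbers; here's a quick sanity check in Python using the standard dynamic-range and Nyquist formulas:

import math

bits, sample_rate = 16, 44_100
dynamic_range_db = 20 * math.log10(2 ** bits)  # ~96 dB, beyond any normal listening room
nyquist_hz = sample_rate / 2                   # 22,050 Hz vs the ~20 kHz limit of human hearing
print(f"{dynamic_range_db:.1f} dB, {nyquist_hz:.0f} Hz")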
This, and software always rises to meet the power of a new hardware feature set. Ray tracing is only in its infancy, for example.
There's a limit to how realistic something can look; some stuff done in UE5 already looks like a real-life recording.
That's just one aspect. There are many other ways to expand a game's use of the GPU: larger draw distance, more densely placed elements in a scene, higher-quality shadows and lighting, etc. Every front can be improved upon.
Then you want to run that realistic display at realistic framerates (above 24 or whatever consoles target) and with a bigger draw distance, and then you want things to actually happen in the scene on top of all of the above, instead of just a scripted flyover.
That, plus all the AI shit (which is only going to get better and therefore more demanding), will make sure the GPU will always be a dedicated component.
I've been waiting on AI accelerators in CPU chips for a long while, sorta like the video encoders/decoders Intel has had for years.
Small-scale AI inference for general consumer use is extremely viable if utilized to generate dynamic AI bots, TTS, speech recognition, etc.
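A hedged sketch of what "small-scale AI inference on the CPU" could look like: a tiny MLP forward pass in plain NumPy, with random stand-in weights rather than a real model:

import numpy as np

rng = np.random.default_rng(0)
w1, b1 = rng.standard_normal((16, 32)), np.zeros(32)  # stand-in weights, not a trained model
w2, b2 = rng.standard_normal((32, 4)), np.zeros(4)

def forward(x):
    h = np.maximum(x @ w1 + b1, 0.0)  # ReLU hidden layer
    return h @ w2 + b2                # output logits

print(forward(rng.standard_normal(16)).shape)  # (4,)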
We've already been there since around when Skylake came out.
Unless you're doing GPU-related work or are a GaMeR, you really don't need a dedicated card.
Not until we can do 100% fluid physics.
My 5600G runs all my games fine on its own
on laptops? it's already happening imo.
on desktop never
>1080p only
within a few years
Nvidia would never allow that.
If they can make an APU that gives 240 FPS on a 4K monitor, then I'm good with them selling APUs and not selling dedicated GPUs.
99% of people don't need 240 FPS at 4K. If APUs can do 1080p at 30 or 60 FPS (i.e. a console), then 90% of GPUs are kill.
Then 144 Hz becomes standard, then what?
Continue playing at 60 FPS. You can always just add a GPU, but most people still wouldn't bother.
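For scale, the raw pixel-throughput gap between the two targets in this exchange, as a quick worked example (fill rate only, ignoring everything else that scales with resolution):

def pixels_per_second(w, h, fps):
    return w * h * fps

ratio = pixels_per_second(3840, 2160, 240) / pixels_per_second(1920, 1080, 60)
print(ratio)  # 16.0: 4K @ 240 FPS needs 16x the raw fill rate of 1080p @ 60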
APUs are worth nothing without unified memory and a very fast CPU interconnect.
Forever, due to gaming, and to a lesser extent machine learning. I say lesser because there are a lot of ML applications that can run on just your CPU, and plenty of it is done on web services rather than locally, but very few games can run without a discrete GPU, and cloud gaming sucks.
Gaming has stagnated for the last decade. I can still play new games well enough with a GTX 970. The only reason to upgrade now is for memes like 4K.
Sound cards effectively still exist.
People just use USB DACs.
I don't think it's about whether they exist, but whether or not they're much more niche.
When gaming companies get their shit together and stop making games that require a high-end GPU to run while looking worse than shit that was released 10 years ago.
So never.
AMD's APUs already made the poverty-tier 1030-esque garbage obsolete and pointless. Making a 3050/3060 equivalent will need some drastic changes to the IO die, because the current design is bandwidth-starved to a comical degree. Either drop DDR altogether in favor of HBM, or make a 6- or 8-channel memory controller tuned for frequency rather than latency, and then figure out how to package that abomination into a standard ATX form factor.
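The bandwidth gap is easy to put numbers on; a back-of-envelope sketch (the DDR5-6000 speed is an assumed typical configuration; the 3060 figures are its stock 192-bit bus at 15 Gbps GDDR6):

def ddr_bandwidth_gbs(channels, mega_transfers, bytes_per_channel=8):
    # channels * 64-bit channel width * transfer rate
    return channels * bytes_per_channel * mega_transfers / 1000

print(ddr_bandwidth_gbs(2, 6000))  # ~96 GB/s: dual-channel DDR5-6000 feeding today's APUs
print(ddr_bandwidth_gbs(8, 6000))  # ~384 GB/s: the 8-channel controller proposed above
print(192 / 8 * 15)                # ~360 GB/s: RTX 3060, 192-bit bus at 15 Gbps GDDR6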
dGPUs will keep getting higher in price and will therefore be reserved for game-streaming servers and AI-training servers.
Normies want low-noise, long-battery systems, so they will only buy APU-based systems anyway; you can already see it with the M1 MacBooks' success.
Never. If AMD wanted desktops with powerful integrated graphics, they would already exist.
Never, unless you start soldering GDDR to the motherboard. Desktop APUs will always be crippled by a lack of memory bandwidth.
If APU only means an on-die dedicated GPU, probably never.
But general-purpose hardware is now fast enough, and "hardware acceleration" requirements common enough, that it doesn't really make much sense to develop separate accelerators anymore. The only obstacle is the cost of moving the industry to a sane unified scalar/vector architecture like the one being developed at libre-soc:
https://libre-soc.org/3d_gpu/architecture/
https://libre-soc.org/openpower/sv/
For now the big players are content with keeping things as they are, because they don't want to risk losing the advantage of having put a lot of R&D into the current paradigm. But if one of them takes the risk and invests in true hybrid processors in order to gain a head start in their development, it's probably safe to say that dedicated GPUs would disappear completely.
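A minimal sketch of the programming-model idea, not libre-soc code: a GPU-style per-pixel workload written as ordinary vector code that a unified scalar/vector core could run directly:

import numpy as np

h, w = 4, 8
u, v = np.meshgrid(np.linspace(0, 1, w), np.linspace(0, 1, h))
pixels = np.stack([u, v, 0.5 * np.ones_like(u)], axis=-1)  # a trivial gradient "shader"
print(pixels.shape)  # (4, 8, 3): the same data-parallel pattern a GPU shader expresses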
Tons of casual users use integrated graphics, so I imagine they'll be happy with an upgrade to APUs, but I can't imagine that anyone who utilizes a graphics card (gamers, people who render shit, people using AI) will be using APUs any time soon. You'd have to get to the point where APUs are at least as good as mid-level graphics cards, which they are nowhere near. Also, games keep getting more and more graphics-intensive. So yeah, I don't see them replacing GPUs anytime soon.
Never
Memory is too much of a bottleneck.