Why AMD over Nvidia?
Falling into your wing while paragliding is called 'gift wrapping' and turns you into a dirt torpedo pic.twitter.com/oQFKsVISkI
— Mental Videos (@MentalVids) March 15, 2023
Poor. There is no other reason. Even Nvidia works better on Linux.
nvidia limits you to 4 displays. DisplayPort supports splitters, so most amd cards can run 7 or 8 monitors.
nvidia detects you're using a splitter and pops up a message: "this GPU only supports 4 monitors."
absolutely pathetic that even decade-old mid-range amd cards support more monitors than new top-of-the-line nvidia ones
wtf are you using 5+ displays for? I'm an autist who loves multi-monitors and even my limit is 3.
flight sims
>you're just poor that's why I get the 1060/2060/3060 instead of the 1080/2080/3080/4080!
I'd rather buy a premium 3060 over a dogshit 6800.
>pay $100 more for a shittier product.
Yeah nah. I briefly looked at a 3060 and went for a 6700XT instead.
I have two PCs. One Nvidia (EVGA RTX 3080Ti) and One AMD (Sapphire RX 6800XT). They both cost the exact same price. Here's my experience:
>AMD card was much quieter than the Nvidia card while gaming
>Stability wise, the 6800XT crashed less than the 3080Ti (sometimes 3080Ti ran out of vram in 4k)
>For some reason, the HDR colors don't look as good on the Nvidia card as they do on the AMD card, no matter how many times I accurately calibrate it on my LG C1 OLED
>My power bill is noticeably higher in months where I use the Nvidia PC more than the AMD PC
RDNA2 was too fucking good.
>asus
lol lmao
Asus is the best
buenos dias pedro
Asrock is the best islandchinkshit for the money. Like the xfx of many years ago.
At least buy Sapphire if you're buying AMD
Sounds like old men bitching about oil filter brands.
None of your business you dizzy cunt
Linux
No. Correctly set up Nvidia is better.
Proprietary drivers will never be correctly set up, Huang cocksucker
Proprietary drivers are always correctly set up. It's the rest of your meme rolling release distro that is wrong.
>Proprietary drivers are always correctly set up.
>always
Three years tops or whenever nvidia decides you need to top up your subscription.
Have fun on PoopOS because you are too dumb to install a proprietary driver by yourself, a driver which more than likely downtunes your GPU with updates just like on Windows.
To match the CPU
So I can maintain the illusion of choice and control over my hardware
Better drivers for Linux/VMs with passthrough.
>Better drivers for Linux/VMs with passthrough.
Not true outside the brains of foss zealots; nvidia drivers have always been more stable and performant than other vendors' on linux.
>passthrough
>amd
Shill bot
I'm getting a 7900XTX this weekend convince me not to
It's a $650 GPU LARPing as a $1,000 GPU. The market's busted. Don't be a pawn.
I will get a 4080 and you can't stop me
don't pretend to be me nigga, I'm deciding on either 7900XTX hellhound or 7900 XTX red devil
Just stay with whatever you have and lower settings.
Both suck.
Actually, no, the Hellhound is fine but the Red Devil has a shitty paste job from factory so the hotspot reaches stupid high temps. Also coil whine like a motherfucker.
If you're going XTX either get the Nitro (money to waste) or the MERC
nitro sadly doesn't have 3 DP 2.1s
Does not work with ROCm (which is also Linux only), useless for AI in other words
does it not?
I only care about rasterization and AI stuff is still at its infancy.
Rasterization refers to the process of converting vector information into a bitmap image.
What you're thinking of is called Gaming Workload Without Ray Tracing.
no good reason, nvidia even has open source drivers now
AMD cards are priced nicely and just work.
>b-b-buh muh gaytracing!!11
Don't need it.
>gaytracing
It really is a meme. In most games there's very little visual benefit. Massive FPS loss is not worth it.
just cope with DLSS and make it look 10x worse with shimmering!
man those raytraced shadows sure look great behind all that noise and shimmering!
If you're poor and/or have an extra chromosome
I'm on Linux.
linux
Get whatever works best for your use case. Stop getting hung up over fanboy nonsense. Neither company is your friend. Their only concern is their shareholders.
no reason
They're just salty they can't generate thousands of AI waifus.
No reason. I bought AMD (a 6700 XT) after a long time away and, idk, it feels like I'm missing out on encoders and AI stuff
where i live a 3050 costs the same as a 6700 XT
amd for basic gaming and office work is fucking amazing
im not giving israelitevidea 800 euros for a fucking 8GiB vram card
>AI stuff
which runs fine on AMD cards dingus
a bunch of models work with DML, even more with IREE (which uses vulkan), and all of them work with HIP on linux
HIP itself works fine on windows; it's just that the ROCm binary libraries are even more of a pain to build for it, since their windows build systems are set up for people who have access to internal driver headers
if anything AMD cards have the advantage of not being fucking gimped by VRAM
honestly it gets clearer and clearer that everyone posting about AI is either stupid or an nvidia shill
seriously, how long has it been since /sdg/ first had guides for AMD cards up?
encoders are exactly the same on AMD as well
this is all just njudea shilling
>encoders are exactly the same on AMD as well
Nah, they're only 99% as good.
to be fair, AMF is kinda a pain to use
but that might just be because i hate COM and i hate things trying to larp as COM
wish there was proper support for vaapi on windows
microsoft claims it works but vainfo shows nothing
Might anyone happen to have the link for the previous AMD thread? It came after the one below.
Got a 6600 for $230. First AMD card, like it a lot; replaced a 970 since I couldn't be assed with adapters/cords/hdmi version shit for a new monitor I got. Sadly though, all the fun ai stuff is cuda, so I'll probably get a 5070 whenever it comes out
cope
Linus switching to AMD days after securing a massive partnership with them once Intel dropped him?
Huh, weird.
>Intel dropped him
When?
Because they do most of their graphics work out of Toronto and I like to buy Canadian
Infinitely better drivers
Better performance
Better value
>Why AMD over Nvidia?
Cheaper, but their compute software stack is shit.
Why do Nvidia boys always pretend amd cards are incapable of running a compute shader? Lol
they have to cope with their overpriced cards
jfl, a 3060 still costs 400 euros
CUDA just werks.
AMD's solution requires hacks and effort.
idk, i'm a graphics dev; from our perspective all you have to do is query the subgroup size and it tells you how many threads you need to spawn in your compute shader for that gpu. That's hardly effort or hacks
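e.g. the entire "hack" is this (rough sketch, vulkan 1.1+, assumes you already have a VkPhysicalDevice handle, error handling omitted):

// query the subgroup ("warp"/"wave") size of whatever GPU is installed
VkPhysicalDeviceSubgroupProperties subgroupProps = {};
subgroupProps.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_SUBGROUP_PROPERTIES;

VkPhysicalDeviceProperties2 props2 = {};
props2.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_PROPERTIES_2;
props2.pNext = &subgroupProps;

vkGetPhysicalDeviceProperties2(physicalDevice, &props2);

// typically 32 on nvidia, 32/64 on amd, 8-32 on intel
uint32_t subgroupSize = subgroupProps.subgroupSize;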
Remember this: Most people here are just porn addicts looking to get off to endless porn.
stable diffusion, vegas pro, blender, all run great on my windows pc with a 6900xt, gpu acceleration just werks
the idea that AMD has any issues with compute in 2023 is a forced meme
>sony vegas pro
holy shit i didn't know you people still existed. why are you still using that software?
17 has everything a video editing suite could need and it's what's familiar to me. I don't edit to make gay soi youtube content, so AI color palettes or whatever meme consoomer shit is in your shill-house software is just not for me
Pic related the power of amd for ai. If you wanna try AI art you're better off with a 1660 super.
really good meme you made there with your 1660 anon
paid shills
>Why AMD over Nvidia?
I can't think of anything useful that AMD cards are better at except being the option of the budget conscious, or poor if you're dick measuring on LULZ. I was completely fine with my old 1650 but now have a 3060 in my rig because I keep convincing myself that I'll get around to CUDA programming and "Honey! I totally need this new card because it's more powerful!"...but still haven't. I built a rig for my stepson centered around a 6600 and he's fine with it because he just needs something that is good for 1080p gaming and he's not an autismo that is gonna REEEEEE over minutiae.
>muh cuda
how many times do you stupid shills need to be told that literally everyone knows this is a lie at this point
OpenCL exists, Vulkan compute exists, OpenGL compute exists, DirectX compute exists, and AMD's clone of CUDA that literally does the same thing, HIP, exists
at this point some of you retards have gotten the right idea and are trying to lie slightly better, saying shit like
>CUDA just werks.
>AMD's solution requires hacks and effort.
but slightly better than completely retarded is still retarded
>but slightly better than completely retarded is still retarded
Then explain how simple it is.
1/2
>OpenCL
build and install the SDK (it's been updated to include more helpers), which means copying folders to the right locations and setting up paths; then create a compute context and compile your kernel (sketch at the end of this post)
optionally compile the kernel outside the opencl api using clang for advanced CUDA-like features like GPU ASM; AMD doesn't support SPIR-V here since there's no point when you can just target the GPU's ISA
>OpenGL compute
build and install an opengl loader library, create a graphics context, compile shader
>DirectX pre-12 compute
SDK ships with windows SDK
create device??? and other shit?? and compile shader (directX is gay)
>Vulkan compute
install SDK, or build it if you want a version not shipped by LunarG
compile shader outside of application using glslc
create instance, create device (creating queues), create command pools, create compute pipeline, etc. etc.
vulkan initialization is a pain but it's all boilerplate
>DirectX 12
SDK ships with windows SDK
create device??? compile shader separately? presumably similar shit to vulkan init (directX 12 is not any less gay)
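and here's the promised opencl sketch, the whole "create compute context, compile kernel" part end to end (minimal, zero error checking, grabs the first GPU it finds; treat it as a sketch, not production code):

#include <CL/cl.h>
#include <cstdio>

// kernel source compiled at runtime by the driver
const char* src =
    "__kernel void add1(__global float* buf) {"
    "    buf[get_global_id(0)] += 1.0f;"
    "}";

int main() {
    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    // create compute context + queue (OpenCL 2.0+; use clCreateCommandQueue on 1.2)
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueueWithProperties(ctx, dev, NULL, NULL);

    // compile kernel
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "add1", NULL);

    // buffer setup, dispatch, readback
    float data[4] = {1, 2, 3, 4};
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(data), data, NULL);
    clSetKernelArg(k, 0, sizeof(buf), &buf);
    size_t global = 4;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);
    printf("%f %f %f %f\n", data[0], data[1], data[2], data[3]);
    return 0;
}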
Dead bolts are better, stronger, and faster.
2/3
this ended up being longer than i thought as i wanted to make a detailed install guide for the unofficial windows setup for any AMD bros out there
>HIP
on linux, run the install script/download from AUR
>>but but that recent blogpost we were shilling where some retard tried to install ROCm on an unsupported system and the install script didn't work
ROCm only supports certain hardware on certain versions (whoever wrote that is an idiot; the current version supports multiple GPU ISAs, i forget how far back it goes, but hipcc's default offload target selection (hipcc itself is unneeded) is probably it)
on windows
as there's no "normal" prepackaged SDK, you either use orochi (the officially supported install, what blender uses), which is a wrapper over CUDA's and HIP's runtime compilation APIs, works similarly to how OpenCL works in both CUDA and HIP, and requires only the basic setup of a normal C/C++ library install
or you use the unsupported single-static-binary approach, like normal HIP programs on linux and CUDA programs
this guide is much more detailed due to being unsupported, even though it's obvious to anyone who knows anything about HIP's architecture
the following takes the place of installation and further setup
>normal toolchain setup stuff, which has comparable parts in (or is a prereq for) the other installations
install clang/llvm toolchain for the compiler
make SDK root dir
clone and build https://github.com/RadeonOpenCompute/ROCm-Device-Libs
install to SDK root dir
clone https://github.com/ROCm-Developer-Tools/HIP
clone https://github.com/ROCm-Developer-Tools/hipamd
plop include folders from both repos in SDK root dir/include
>the only vaguely hacky part
create the HIP version header, hip_version.h, in SDK root dir/include, defining HIP_VERSION_MAJOR, HIP_VERSION_MINOR, HIP_VERSION_PATCH, HIP_VERSION_GITHASH, HIP_VERSION_BUILD_ID, HIP_VERSION_BUILD_NAME, HIP_VERSION, and __HIP_HAS_GET_PCH
right now they should be 5, 4, 22971, "cf12a8e4", 0ULL, "", 50422970, and 0
i made a tool in lua to dump the runtime values using info from the HIP headers, you could easily do this as a binary program
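concretely, the whole "hacky" step is just writing this file by hand (values as quoted above for my driver; dump your own to be sure):

// hip_version.h -- stub for the header the ROCm build normally generates
#define HIP_VERSION_MAJOR      5
#define HIP_VERSION_MINOR      4
#define HIP_VERSION_PATCH      22971
#define HIP_VERSION_GITHASH    "cf12a8e4"
#define HIP_VERSION_BUILD_ID   0ULL
#define HIP_VERSION_BUILD_NAME ""
#define HIP_VERSION            50422970
#define __HIP_HAS_GET_PCH      0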
dump the export table of amdhip64.dll and use llvm-dlltool to make an import library from it in SDK root dir/lib
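if you've never done this, it's roughly the following (from memory, check your llvm version's flags; turning the export dump into a .def file is a few lines of scripting):

llvm-readobj --coff-exports amdhip64.dll
llvm-dlltool -m i386:x86-64 -d amdhip64.def -l amdhip64.lib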
>more normal shit
set include/path/lib environment variables
compile using command line specified here as a base
(you should know what's not needed and what is, use a config file and a symlink, i do, that picture is just an example, my actual invocation is much simpler)
>but but nvcc makes compiling simpler
so build hipcc, it's just a wrapper over clang that uses older defaults for that command line
doesn't need anything special to build, just a few env vars to run
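for the curious, a bare invocation looks something like the line below; this is my own reconstruction, not the exact picture, and gfx1030 is just an example target, substitute your GPU's ISA:

clang++ -x hip --offload-arch=gfx1030 --rocm-path=<sdkroot> -I<sdkroot>/include -L<sdkroot>/lib -lamdhip64 main.cpp -o main.exe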
header-only ROCm libs work as long as you get their version from cmake; binary ones don't, due to fucked build systems that can be hacked into working (i got rocblas working without tensor support, since the tensor library is generated by a stupid fucking python script that made cmake debugging hard)
>but but that means my AI coomer libraries/programs don't work
and?
we're not talking about using HIP for AI, which has other solutions on windows
this is for developers, which you aren't if you're just an AI coomer
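for reference, this is what all that setup exists to compile; HIP source is just CUDA with the vendor prefix swapped (minimal sketch, no error checking):

#include <hip/hip_runtime.h>
#include <cstdio>

__global__ void add1(float* buf) {
    buf[threadIdx.x] += 1.0f;
}

int main() {
    float host[4] = {1, 2, 3, 4};
    float* dev = nullptr;
    hipMalloc(&dev, sizeof(host));
    hipMemcpy(dev, host, sizeof(host), hipMemcpyHostToDevice);

    // same triple-chevron launch syntax as CUDA
    add1<<<1, 4>>>(dev);

    hipMemcpy(host, dev, sizeof(host), hipMemcpyDeviceToHost);
    hipFree(dev);
    printf("%f %f %f %f\n", host[0], host[1], host[2], host[3]);
    return 0;
}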
>>but but that means my AI coomer libraries/programs don't work
>and?
>we're not talking about using HIP for AI, which has other solutions on windows
>this is for developers, which you aren't if you're just an AI coomer
My point. No CUDA = no full value.
>it's really simple
>you just have to use this unofficial hack
you are wasting your time they don't care
I have a 1080ti and a gigantic savings account, i'm ready to upgrade. From what I can tell the 4080 and 7900xtx trade blows but the xtx is cheaper, I don't care about ray tracing, but 24GB of vram would be very nice for some gigantic acceleration structure graphics dev ideas I have
>>you are wasting your time they don't care
i know
someone who wants to do GPGPU on an AMD gpu might though
my goal is to spread knowledge even if no one listens
but no one asked how to hack a solution with AMD. everyone interested in ML already has nvidia.
if all you are is a worthless AI coomer it doesn't matter to you whether or not the pytorch backend you're using is CUDA, HIP (which literally just works on linux), or Microsoft's DirectML
it doesn't even matter to you retards if it's pytorch at all; it could be IREE, which uses vulkan compute, works on AMD cards regardless of system, uses fewer resources, and runs on the much faster python 3.11
face it
you literally have no idea what you're talking about
>HIP (which literally just works on linux)
With a very tiny subset of old hardware. CUDA works on a wide range of it, works well, and doesn't require esoteric commands pulled off some janky website.
>1080ti
Lower settings and have more money. If you actually earned that money you would not want to buy a depreciating asset in a bad market.
>depreciating asset
it's a graphics card not a fucking house, i'm not trying to 10x my money I just want to rasterize shit faster at high refresh rates
any real reasons not to get this card? like catastrophic ones?
It's a waste of money. Lower settings. Stop consooming.
More prettyful grafix will not improve your miserable existence.
I'm not asking for financial advice. Going to microcenter this weekend baby I've skipped enough generations
Lower settings. You hardly play any video games anyway.
a 1080 ti does not cut it if you want to gayme at over 100 fps
It's good enough when you lower settings. I have a 3080 and I will be lowering settings until I can't get a stable 60 FPS @1440p (native monitor resolution).
the aforementioned issue with HIP not working on RDNA3 (which means no acceleration for modern blender versions since they dropped opencl after nvidia bribed them) and those weird issues the linux drivers are having
you just proved you have absolutely no idea what you're talking about
the first and second posts have zero workarounds, and that's how every single person who isn't a CUDAlet, including game engine developers, programs for the GPU
Ok thank you for providing a real issue, my focus is real time rendering anyway. I do use Blender but I'm not too impacted by that.
I am running into this as well. Can't hit 144fps reliably anymore
you can use https://github.com/KomputeProject/kompute btw you don't need CUDA
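from memory of their README the usage is about this much code (sketch; you compile the shader to SPIR-V yourself, and the API may have shifted between versions):

#include <kompute/Kompute.hpp>

// assumes `spirv` holds a compiled compute shader computing out = a + b
// over three storage buffers bound in the order given below
void run(const std::vector<uint32_t>& spirv) {
    kp::Manager mgr;  // picks a device, sets up instance/queues for you

    auto a   = mgr.tensor({2.0f, 2.0f, 2.0f});
    auto b   = mgr.tensor({1.0f, 2.0f, 3.0f});
    auto out = mgr.tensor({0.0f, 0.0f, 0.0f});
    std::vector<std::shared_ptr<kp::Tensor>> params = {a, b, out};

    auto algo = mgr.algorithm(params, spirv);

    mgr.sequence()
        ->record<kp::OpTensorSyncDevice>(params)  // upload
        ->record<kp::OpAlgoDispatch>(algo)        // dispatch
        ->record<kp::OpTensorSyncLocal>(params)   // download
        ->eval();
}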
kompute is a bit immature
i don't even think they have support for VK_KHR_buffer_device_address yet which should be critical for something like this
and i don't know if they're doing enough to mitigate a lot of the (necessary in many cases) annoyances of rapid iteration using vulkan
google's IREE is fine for ML at least
Workarounds and hacks. Got it.
Nah cuda just has a way better ecosystem
>lie slightly better, saying shit like
>>CUDA just werks.
>>AMD's solution requires hacks and effort.
How is it a lie? AMD not being a 'just works' solution should lower their product prices to stay competitive, imho.
because the "hacks" you're talking about are comparable to running an installer (and are running an installer on linux) or basic SDK build processes
they aren't hacks, they're normal developer shit anyone who actually programs does multiple times an hour with the sole exception of making an export library for a DLL that doesn't have one which still happens fairly often
>With a very tiny subset of old hardware
it works on plenty of generations of GPU, just not the latest
>esoteric commands pulled off some janky website
or the officially supported way, which is how blender does it
and this isn't a hack, this is doing what AMD refuses to do: release the files for a windows SDK
what is the status of ROCm/HIP on the 7900 xtx?
Doesn't work.
Which is actually hilarious. AMD made a compatibility layer that worked on exactly one generation of GPU. Classic AMD software.
Is this post sarcasm? I was nearly convinced by the earlier poster that going amd is alright now.
Actually, I lied. It could be hacked to work on Polaris/Vega/RDNA1. But not RDNA3. Officially, it only supports RDNA2.
>hacked to work on .../Vega
Vega10 owner here, it works fine.
>stepson
amd works better on linux
>inb4 proprietary nvidia drivers
Linux
Why u niggas even care about PC specs nowadays? So you can brag about running Hogwarts in 4K? lol lmao
Ummm, i have an intel ARC card. It plays all muh gaymes extremely well and likely has more VRAM than i'll ever use (16gb). Not really an rgb fan but i like the blue and purple default color scheme.
Only downside is lackluster Linux support, but that's supposed to get better with the next kernel update, so i can wait a bit before going back.
Either way, i like my GPU. I wish you guys liked your GPUs too... i'm very sorry you're all so pissed off about your purchases...
They do the same thing. It's like comparing deodorant brands or some shit
Except the green deodorant bottle doesn't make you solve a Rubik's cube every time you want to use it.
So nobody likes the A770? Why give the pluton company money if that's around?
>Why AMD over Nvidia?
because if you don't need gay tracing and other shit, they scale better with AMD CPUs
4090 clearly crushes the competition here.
yes, but 7900XTX works way better on AMD compared to Intel
It's also way more expensive?
What's the cpu used in the baseline 3080ti config?
13700k
and 7700x for AMD, it's right there on the graph
do I really need to consoom and change my Intel CPU to AMD, for fuck's sake
AMD isn't without sin either, but until they reach ngreedia tiers of bad I'll never switch. I've never experienced any of the complaints people have with AMD either. AMD works fine for me.
Because I use Linux.
gayming -> amd
just werks -> nvidia
simple as
you fucking retard, first stop reading reddit then neck yourself
maybe you should take your own advice, amd poorfag
Serious question, what do you do on AMD if you just wanna stream a game to a friend or two? Set up OBS and stuff? Kinda sucks that a lot of applications support nvenc encoding but not AMF
right click desktop -> open amd control panel -> record & stream
damn, that easy?
And does that use the AMF encoding and stuff with good quality?
yeah, it's good enough for twitch and youtube. my nvidia friends' streams actually look worse because they set them up wrong with OBS or something, while with amd it's all mostly auto-configured.
I can get an Asus Dual Radeon 6700XT for £387 where I am. Worth it for an upgrade from a 970, or should I wait for le 4070/4060 releases?
Imagine being a retarded loyal brandwhore instead of just buying the best option of each generation lmao
You homosexuals act as if Jensen or Lisa sends you a personal thank you note and a $50 gift card for sticking with le red or green card each build and defending their companies FOR FREE on an autistic imageboard. They don't even know you exist nigga
>he doesn't know about Team Red discount
Lisa literally gives out $50 gift cards for sticking with le red and defending their companies.
Hmmm I will consider it if she gives a blowjob instead.
>800 buckerinos for a 4070 which is just a 4060 in reality
We are clearly past the point of reason my dear anon. It's fucking over.
amd is only good if you're a poorfag gaymer. if you need to use any productivity software like blender or premiere, nvidia will completely ass blast amd in performance
Because I don't like sucking dicks.
Mm, quite simple actually: never had any issues with Nvidia/Intel stuff in the past years, so I just don't want to take risks and try modern AMD/Radeon (and I did have my share of problems with them back in the 00's)
6700 xt is cheaper than 3060 where I live.
the fucking RTX 3050 garbage costs a little bit more than a 6700 non-XT where i live
i'm sorry for green bros, unless you have the money for an actual 4090
1. Way smaller TDP. My needs aren't huge, I don't really need a GPU that uses more than 54W in full speed.
2. GPL drivers.
3. Not Intel.
Never
Despite AMD showing superiority in pure raster, they still suck when it comes to everything else. It could be remedied with decent pricing, but even that they fucked up.
>superiority in pure raster
which gpu?
I got a 6900xt for $599. Best purchase I've ever made in tech.