just bought this
what am i in for?
same price as 3080s and 3080tis
performs similarly to the 3090 in a lot of cases
think i'll buy a 4090 if this doesn't satisfy me
4070Ti meta?
Falling into your wing while paragliding is called 'gift wrapping' and turns you into a dirt torpedo pic.twitter.com/oQFKsVISkI
— Mental Videos (@MentalVids) March 15, 2023
Wrong board
Buy either the 7900 or the 3090 because the anaemic 4070 VRAM is a total scam.
VRAM doesn't matter above 10GB, unless you're using a shitty AMD GPU. Which is probably why a lot of AMDrones whine about VRAM, their shitty cards piss away 3GB of it for no reason.
>VRAM doesn't matter
sit down you fucking child, you clearly don't do ML/AI
Anyone doing ML/AI is renting time on Colab or has access to a facility with A100/H100 available. Sorry, that 24GB is right in e-peen range, too high for gaming, too low for ML/AI.
that's nice shlomo
not renting shit
So just buy an A100 80GB. You're poor? It's only $8000.
made me look it up
you can get a 40gb for $3k-4k, considering it
you need rack cooling tho, and each 3090 is like $800 and can be nvlink'd, probably more cost effective
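The cost comparison above, as a quick $/GB sketch (all prices are the rough figures quoted in-thread, not verified, and the A100 40GB uses the midpoint of the $3k-4k range):

```python
# $/GB of VRAM at the prices quoted above (illustrative, not verified)
cards = {
    "A100 80GB": (8000, 80),
    "A100 40GB": (3500, 40),  # midpoint of the $3k-4k quote
    "RTX 3090":  (800, 24),
}
per_gb = {name: price / gb for name, (price, gb) in cards.items()}
# Three used 3090s give 72GB for $2400 -- cheaper per GB than one A100,
# but only if your workload can actually be split across GPUs.
```

By this math the 3090 is the cheapest VRAM per dollar, which is the "probably more cost effective" argument, modulo the rack cooling and multi-GPU caveats.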
nvlink doesnt work for AI
You're talking to someone who's handled various Phase One backs, don't tempt me.
>give us your data goy
>AMD
>AI
Kek, joke fren, but still no ROCm drivers for RDNA3, and ROCm is Linux-only anyway
>VRAM doesn't matter above 10GB
you're on the wrong board
Ever heard of a certain wizard game perhaps?
>Ever heard of a certain wizard game perhaps?
Heh. My wife is playing that one on a mobile 3050 with 4GB of VRAM.
Runs totally fine btw.
Wizard game is a soicode outlier.
couldn't you have gotten a 7900 xt for a little bit more?
also if you could afford it go for the 4090
>12gb
>192bit
lol lmao
i dont have a 4k monitor
doesnt really matter
amd shit for vr
Should have bought AMD.
>b-but w-what about
Doesn't matter.
amd gpu bad for vr
I've read that the smaller bus makes the 4070ti bad for VR. I bought a 3080 a month before the 4070ti came out. I am happy with how it performs in VR.
guess im finding out the hard way
I bought a 7900, returned it within a week and got a 4080; no ROCm drivers, couldn't run AI even on Linux.
AMD is for gaymen only. For anything else get nvidia. I suppose AMD matches nvidia in DaVinci Resolve, but if you're into that or start using it, you may end up expanding into other stuff where AMD's awful or doesn't work at all.
If you're only gonna game it'd be fine.
vramlet lmao
perfect amount for what i do, wont need more for a very long time
>cheaping out on vram, the only resource left in the world
LMFAO RETARD
>what am i in for?
Overpaying by about $150
Not that bad if you can afford it
not a problem
NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO
it performs similarly to a 3090, but has half the vram
a 3090 can be had for the same price on ebay
>$829.99
>192-bit memory bus
you aren't changing anything by posting this fatty mcdoodoo
My 1080ti died, was looking into a 3080ti or a 6800xt. Leaning towards amd because it's affordable and I don't have to deal with nvidia's GeForce Experience login shit.
Can the 6800 do stable diffusion? I do this with my 1080 and have no issues.
Can the 6800 run games decently at 2k resolutions? Please help, I'm stuck phonefagging until it's replaced.
>Can the 6800 do stable diffusion?
Very annoying to set up, requires Linux, performance is comparable to a 2070.
>Can the 6800 run games decently at 2k resolutions?
Yea, it's better at low resolution (i.e. below 4K).
why would you niggas reply to him
why wouldn't I reply to him
Crap, I use stable diffusion to make characters for dnd campaigns I run. I can't use stuff in that general 'cause they are too lewd. Should I just look at a 3070? I don't run Linux, so that rules that route out. I run two 2k monitors and play games in 2k on one screen.
>people like this exist
I thought the new amd cards, 6000 series and above, could do ai stuff? Did you have any noticeable issues with video processing on the amd stuff? Honestly contemplating dropping this a.i. stuff and going amd 'cause they seem a hell of a lot more affordable, but my 1080 lasted me like 5 years before it bit the dust this week.
by relatively slow I don't mean it's unusable, but a similarly priced nvidia card (turing or newer) would probably be faster
7000 series has dedicated AI accelerating hardware, similar to nvidia's tensor cores, unfortunately all of the common AI libraries use CUDA as a backend, and AMD's compatibility API (HIP) does not support the 7900
the 7900 performs well with other APIs, but there are far fewer people working on them; you are at the mercy of AMD's perpetually tardy development cycle
there are bindings for torch and nearly everything else, they're just not built into the common ai programs people are running
memory pooling doesn't work, if that's what you mean, you still have to manage it
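What "you still have to manage it" means in practice, as a toy sketch — the layer names, sizes, and device labels below are made-up illustrations, not measurements from any real model:

```python
# Without transparent memory pooling, you place pieces of the model on
# devices yourself. Sizes in GB are invented for illustration.
layers = {"unet": 10.0, "vae": 2.5, "text_encoder": 1.5}
devices = {"cuda:0": 24.0, "cuda:1": 24.0}  # e.g. two 24GB cards

placement = {}
free = dict(devices)
for name, size in layers.items():
    # greedy: put each layer on the device with the most free VRAM
    dev = max(free, key=free.get)
    if size > free[dev]:
        raise MemoryError(f"{name} does not fit on any single device")
    placement[name] = dev
    free[dev] -= size
```

The key constraint the sketch shows: no single layer can exceed one card's VRAM, no matter how many cards you link together.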
I was just considering the 6800 series because it seems to be the logical upgrade from a 1080 in terms of price and performance. I just heard mixed things about it being able to do stable diffusion on windows, and that's the holdback. Right now it takes about 15 seconds or so for my 1080 to gen an a.i. image, and that's not bad considering I just use it for tabletop character pucks, so speed isn't an issue. I'm not a Linux user, so I'm not sure if the 6800 would even work for the way I use stable diffusion. My friends use a 6800 for games and video processing and it seems to work fine though. But I'm against a wall here with no functioning gpu.
>windows
Absolutely not. It does not, unless it's the nmkd gui with Onnx models… the options are extremely basic… tbh it is bad. Real bad. You want it to be fully featured and run auto1111, invokeai, comfyui etc.
I got 5 it/sec out of a 7900 xt on that very basic 512x512 nmkd gui vs 23 it/sec on auto1111 with a 4080. Speed will drop with more complex stuff and increased res, but yeah.
A 6800 will have to run Linux and rocm drivers, with some fiddling around and crossed fingers to get auto1111 working with the gpu.
AMD is just not there yet for AI. The Linux solution is for people who already have an AMD; looking at a new card? nvidia. Hence I returned mine and got a 4080.
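To put those it/sec figures in wall-clock terms, a sketch assuming 20 sampler steps per image (a common default; the posts only give the it/sec rates):

```python
# it/sec -> seconds per image, assuming 20 sampler steps (an assumed
# default; only the it/sec figures come from the thread)
steps = 20
setups = {"7900 xt (nmkd/Onnx)": 5.0, "4080 (auto1111)": 23.0}
seconds_per_image = {name: steps / rate for name, rate in setups.items()}
# 20/5 = 4.0s vs 20/23 ~ 0.87s per 512x512 image at these rates
```

So at these quoted rates the 4080 setup is roughly 4-5x faster per image, before the resolution and complexity penalties mentioned above.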
>5 it/sec out of a 7900 xt
wtf. I'm getting ~8 it/sec on the a770
It was in the nmkd gui with an Onnx model, not safetensors via normal PyTorch etc. The control you have, functions, results and potential are pretty garbage too tbh.
A 6800 non-XT is around 6 too with ROCm on Linux. Both windows and the 7000 series are off the table atm.
Other option is nod-ai but it looks very basic and limited compared to what can be done such as with auto1111 etc
https://nod.ai/sd-rdna3-ces2023/
RDNA3 (7000 series) has no driver support. It's on AMD's rocm github… no support until version 5.5, and on top of that it's Linux-only… allegedly they're bringing it to windows, but I bet the 7900 xtx vs 4080 will perform similarly to how they do in blender in this case tbh. It's such a shame.
>Can the 6800 do stable diffusion?
yes, but it's relatively slow due to lacking matrix multiplication hardware
>Can the 6800 run games decently at 2k resolutions?
benchmarks are readily available elsewhere
My 980ti is working fine.