>those RTX 4000 prices

Thanks, Mr. Leather Jacket Man, but we'll take the GPU market from here.

  1. 2 years ago
    Anonymous

    It's HER turn! Bring on the RED POWER, girl!

    • 2 years ago
      Anonymous

      >her
      It's Satoru Iwata after faking his death and transitioning.

      • 2 years ago
        Anonymous

        Lmfao

      • 2 years ago
        Anonymous

        Lacist.

      • 2 years ago
        Anonymous

        What are you talking about?
        Jensen and Lisa are the exact same person.
        The jacket works wonders for swag

        • 2 years ago
          Anonymous

          they're closely related
          https://www.techtimes.com/articles/253736/20201030/fact-check-nvidia-ceo-uncle-amds-dr-lisa-su.htm

        • 2 years ago
          Anonymous

          you do know they're blood relatives, right? they're literally from the same family. also, what are the odds of that?? how come two people from the same family both get jobs in the same type of companies? how does this happen? nepotism?

          • 2 years ago
            Anonymous

            >how does this happen? nepotism?
            Lol. Just good genes. Lisa Su got big at IBM, then moved to AMD for CPUs. Even at IBM she was pushing for chiplet design.
            That's why their GPUs have only now started to get competitive. Once AMD picked up Radeon they ran it into the ground; now they're working it back up.
            Lisa Su actually got to her position via her big brain. She literally saved AMD.

          • 2 years ago
            Anonymous

            The HD7970 was one of their best cards ever

          • 2 years ago
            Anonymous

            >HD7970 released 2011
            >ATI - 2006 (acquisition completed); 2010 (branding phased out)
            HD7970 was the beginning of the end, and they bled talent, or at least innovation, from then on.

      • 2 years ago
        Anonymous

        kekw

        • 2 years ago
          Anonymous

          go back

      • 2 years ago
        Anonymous

        you know, when people "transition", they usually don't change their entire face. i know the meme of all chinks looking alike is a funny one, but 900-come-on-now...

        • 2 years ago
          Anonymous

          >talking about Korean and Japanese people
          >calling them chinks

          • 2 years ago
            Anonymous

            they arr the same

          • 2 years ago
            Anonymous

            >Lisa Su
            >Korean

          • 2 years ago
            Anonymous

            all slanty-eyed people are chinks, bruh. don't start getting smart with me.

      • 2 years ago
        Anonymous

        left is jomon, right is yayoi.

      • 2 years ago
        Anonymous

        Wait so they're not the same person?

    • 2 years ago
      Anonymous

      Do you think her feet been polished with stones? Do she smell like lilac n shit?

  2. 2 years ago
    Anonymous

    >Here's your new Radeon, Anon

    • 2 years ago
      Anonymous

      Still better than 4090.

    • 2 years ago
      Anonymous

      ah, so it's a condom for my futanarigf

    • 2 years ago
      Anonymous

      Come on bro!
      That was clearly photoshopped to make it look smaller with less fans.

    • 2 years ago
      Anonymous

      LARGEEEEEEERRRRRR.

      It's not smart, it's what you do when you have an inferior product. Shave $100 off the price and the hapless drones would eat it up. But now even amdrones would probably buy cheap amperes/moronna2.

      >It's not smart, it's what you do when you have an inferior product. Shave $100 off the price and the hapless drones would eat it up.
      It's exactly what Sony did in 2013 iirc and it worked marvelously. And they did have the superior product. They basically killed MS that day.

    • 2 years ago
      Anonymous

      Who would win in battle?

  3. 2 years ago
    Anonymous

    It was smart for AMD to let Nvidia go first. It gives AMD the advantage of being able to readjust their RX 7000 prices in advance.

    • 2 years ago
      Anonymous

      It's not smart, it's what you do when you have an inferior product. Shave $100 off the price and the hapless drones would eat it up. But now even amdrones would probably buy cheap amperes/moronna2.

      • 2 years ago
        Anonymous

        That's fricking moronic and you're fricking moronic.

        If you have the inferior product then you want to get out ahead and start selling them before your competition lets everyone know they've got a better product. Then all you can do is slash prices, but in the meantime you've at least made some sales at full price.

        • 2 years ago
          Anonymous

          Which is exactly what it seems like NVIDIA is doing. Zero solid numbers given, just a hand-wavy 2-4x (under extremely specific conditions). My guess is actual performance is probably ~30-50% better when not relying on DLSS or RT.

  4. 2 years ago
    Anonymous

    When?

    • 2 years ago
      Anonymous

      RDNA 3 reveal is on Nov 3

  5. 2 years ago
    Anonymous

    Just release an efficient rasterization card for 300-400. They'd gain an absurd amount of market share.

    • 2 years ago
      Anonymous

      300 bucks. 3060 Ti performance. All it would take, but no one can or wants to do it.

    • 2 years ago
      Anonymous

      300 bucks. 3060 Ti performance. All it would take, but no one can or wants to do it.

      Bro the 3060ti should only have cost $270. And that was 2 years ago.

      • 2 years ago
        Anonymous

        The brass balls of Nvidia, announcing a new product line while marketing last gen's cards at the same time, listing the same 3060 MSRP as it had at launch 18 months ago. This is what happens when a company has too much market share.

        • 2 years ago
          Anonymous

          True. But Nvidia is also confident AMD won't have anything to match them, and from the presentation and the advances in ray tracing and DLSS I can see why Nvidia thinks they can continue to overcharge.
          It's really solid software and tech that AMD in no way can match up to.
          If you want the best GPU and feature set it's still Nvidia. And they know you'll pay for it, even though there are millions of used GPUs on the market.

  6. 2 years ago
    Anonymous

    Everyone is now hoping AMD will deliver, but it will be, like always, Another Massive Disappointment.

    • 2 years ago
      Anonymous

      RDNA2 was great. AMD has consistently been delivering since 2019, or even arguably 2017.

      • 2 years ago
        Anonymous

        >RDNA2 was great.
        >Lost to 30 series in Ray-tracing
        >tried to shill FSR but got fricked and tried to shill 2.0
        IF 7000 is going to be better than 4000 in RT and will have something close to DLSS 3, maybe it will be great

        • 2 years ago
          Anonymous

          >Lost to 30 series in Ray-tracing
          No one cares. You don't play video games. RT adds no value. RDNA2 won where it matters: Price to performance, power efficiency and VRAM.

          • 2 years ago
            Anonymous

            is that why AMD's market share is in the gutter?

          • 2 years ago
            Anonymous

            true

        • 2 years ago
          Anonymous

          >lost to Novidya in something that's an explicit Novidya gimmick and an AMD afterthought
          Enjoy managing a simulation of your Amazon warehouse or whatever, Jensen.

        • 2 years ago
          Anonymous

          >muh ray tracing
          Rumao
          Ppl still care about this shit?
          >omg guis 10% better shadow with half the frame rate

          • 2 years ago
            Anonymous

            Ray tracing destroys the performance of games; it's pointless to enable ray tracing if you aren't paying 2k dollars for a GPU (and at that point you would want to play at 4K, which the current GPUs can't handle with RTX enabled either way).
            The only reason to buy Nvidia now is if you want their meme AI garbage

            >Gaytracing
            why yes i'd like to halve my framerate for a gimmick

            Cope.

        • 2 years ago
          Anonymous

          AMD should end up beating Nvidia on price alone, and the 6000 series was a solid competitor outside of 4K. Considering the level of improvement the 6000 series was, I don't think it's unreasonable to think that RDNA3 could work out.

        • 2 years ago
          Anonymous

          Ray tracing destroys the performance of games; it's pointless to enable ray tracing if you aren't paying 2k dollars for a GPU (and at that point you would want to play at 4K, which the current GPUs can't handle with RTX enabled either way).
          The only reason to buy Nvidia now is if you want their meme AI garbage

          • 2 years ago
            Anonymous

            if you want 4K with everything maxed out, DLSS 3 turned on and full ray tracing, then even the 40 series won't be able to pull that off. Screenshot this, because dollars to doughnuts the 40 series won't be able to pull a consistent 70 frames per second in the most demanding games.

        • 2 years ago
          Anonymous

          >Gaytracing
          why yes i'd like to halve my framerate for a gimmick

        • 2 years ago
          Anonymous

          DLSS 3 doesn't even support the 3000 series lol, massive self sabotage from novideo.

        • 2 years ago
          Anonymous

          If you care more about rasterization amd is already the better buy today.
          I wish it had more gpgpu support but I can't be assed to care about upscaling videogames.

        • 2 years ago
          Anonymous

          >Gay tracing
          Who cares
          >FSR
          Good open source alternative that can be implemented everywhere
          >DLSS 3
          Ah yes, using AI to interpolate fake frames instead of having a higher framerate.

        • 2 years ago
          Anonymous

          Say whatever you want to say but Frame Interpolation is fundamentally BAD, don't try to shill that shit here on IQfy.

          • 2 years ago
            Anonymous

            It is the only way to reach HFR.

        • 2 years ago
          Anonymous

          >Ray-tracing

      • 2 years ago
        Anonymous

        >RDNA2 was great
        RDNA2 owner here, NO.

        • 2 years ago
          Anonymous

          Nvidia is moronic, but AMD has started to get greedy, so expect prices to stay near Nvidia's.

          what's the problem

          • 2 years ago
            Anonymous

            I'm a RDNA2 owner too. What's the problem? Been serving me right.

            AMD Drivers. Video decoding makes the browser flash white, sometimes. ROCM is shite in SD.

          • 2 years ago
            Anonymous

            > ROCM is shite in SD.
            i don’t see the issue, runs fast enough and amd cards have plenty of ram

          • 2 years ago
            Anonymous

            I've had less trouble with AMD drivers on Windows than Nvidia.
            But regardless, I use Linux and only use Windows to play incompatible games.

        • 2 years ago
          Anonymous

          I'm a RDNA2 owner too. What's the problem? Been serving me right.

          • 2 years ago
            Anonymous

            Nvidiots keep harping on RT performance being the only metric that matters even if their own cards choke on it and are forced to disable it to keep-up FPS.

      • 2 years ago
        Anonymous

        >RDNA2 was great.

        • 2 years ago
          Anonymous

          Right, correction: RDNA2 is still great. 6950 XT is the king of 1440p.

        • 2 years ago
          Anonymous

          RDNA2 was great. AMD has consistently been delivering since 2019, or even arguably 2017.

          >RDNA2 was great.
          >Lost to 30 series in Ray-tracing
          >tried to shill FSR but got fricked and tried to shill 2.0
          IF 7000 is going to be better than 4000 in RT and will have something close to DLSS 3, maybe it will be great

          >RDNA2 was great
          RDNA2 owner here, NO.

          my ryzen 3700x almost outperforms my 5700xt in stable diffusion
          cpu is 3.5s/it
          gpu is 3.3s/it

          • 2 years ago
            Anonymous

            5700XT is RDNA1, moron.

          • 2 years ago
            Anonymous

            When you run SD on the AMD card, does the GPU peg at 100% usage?
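
            A quick way to rule that out is to check whether the PyTorch install actually sees the card as a ROCm device before timing anything. A minimal sketch, assuming a ROCm build of PyTorch (on those builds the regular torch.cuda API is routed to HIP):

              import torch

              # On a ROCm build torch.version.hip is a version string (it's None on
              # CUDA builds) and the AMD card shows up through the torch.cuda API.
              print("HIP build:", torch.version.hip)
              print("GPU visible:", torch.cuda.is_available())
              if torch.cuda.is_available():
                  print("Device:", torch.cuda.get_device_name(0))

              # Tiny smoke test: if this matmul runs on the GPU but SD is still at
              # ~3 s/it, the bottleneck is the pipeline, not the install.
              dev = "cuda" if torch.cuda.is_available() else "cpu"
              x = torch.randn(4096, 4096, device=dev)
              print("Ran matmul on:", (x @ x).device)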

    • 2 years ago
      Anonymous

      it'd be basically the same performance and $50 cheaper than Nvidia, and then everyone would continue to buy Nvidia because of drivers. Rinse, repeat. Money talks.

      • 2 years ago
        Anonymous

        yep, welcome to the real world, sonny boi. the politicians lied to you when they told you competition would be good for your wallet.

    • 2 years ago
      Anonymous

      AMD needs a DLSS alternative. Some AI softwares too

      • 2 years ago
        Anonymous

        This. Nvidia are scummy but future proof, and if the gaming market can't find a good use for their overpriced technologies the commercial market will. Meanwhile AMD are trying to catch up in the gaming market and have nothing beyond that. Ideologically they live in 2015.

        • 2 years ago
          Anonymous

          >future proof
          their new version of dlss doesn't work on ampere, how is that future proof

          • 2 years ago
            Anonymous

            Cry harder

          • 2 years ago
            Anonymous

            Future proof as in actively advancing features that have a tangible commercial application X years from now, instead of just trying to beat their competitor in a gaming benchmark like it's 2005.

          • 2 years ago
            Anonymous

            incredible, the siemens ad actually works on people buying gaming gpus.

            datacenter applications are all about tflops and efficiency, not gimmicks

          • 2 years ago
            Anonymous

            That's why Tensor cores are a gimmick, right?

          • 2 years ago
            Anonymous

            they're great if you need a lot of fp16 compute, it just doesn't sound as cool and revolutionary if you put it like that

          • 2 years ago
            Anonymous

            Except I'm not buying them because I realize I'm not the target audience. Which is the entire point I was making. Nvidia have the better business leverage currently because they're putting more focus on a gigantic world of commercial applications, while AMD are still chasing after the limited desktop gaming market with seemingly no plan B for any other application. Just because my next card will be AMD doesn't mean AMD Radeon are currently set up better as a business.

          • 2 years ago
            Anonymous

            AMD is doing fine in HPC with CDNA

          • 2 years ago
            Anonymous

            Better buy the card capable of simulating your job as an Amazon warehouse employee then. Enjoy meeting your digital twin!

          • 2 years ago
            Anonymous

            RAW FPS is and always will be the only real metric to go on and not some meme tech like DLSS.
            >23FPS with goytracing

          • 2 years ago
            Anonymous

            AMD needs a DLSS alternative. Some AI softwares too

            This. Nvidia are scummy but future proof, and if the gaming market can't find a good use for their overpriced technologies the commercial market will. Meanwhile AMD are trying to catch up in the gaming market and have nothing beyond that. Ideologically they live in 2015.

            See this shit

            The best part is that the 4080 12GB might just barely match the 6900 XT 16GB right now, which has more VRAM and is currently being sold at $700.

            They will barely match the high end on raster or price/perf, and RDNA3 isn't even out yet.

          • 2 years ago
            Anonymous

            The best part is that the 4080 12GB might just barely match the 6900 XT 16GB right now, which has more VRAM and is currently being sold at $700.

            Gigacope. Benchmarks haven't come in yet and people are already making wild claims that 4080 = 6900 XT (More like 3080 = 6900 XT)

          • 2 years ago
            Anonymous

            it probably could be made to work, but then you would most likely get shitty DLSS performance, and would you want that? why would anyone want that?

          • 2 years ago
            Anonymous

            But it does. It's a very specific feature that doesn't.

          • 2 years ago
            Anonymous

            No it doesn't.
            https://www.theverge.com/2022/9/20/23362990/nvidia-dlss-3-0-demonstration-ada-lovelace-graphics-cards-upscaling-technology

          • 2 years ago
            Anonymous

            Yes it does

          • 2 years ago
            Anonymous

            what's reflex?

          • 2 years ago
            Anonymous

            >Alongside our new GeForce RTX 30 Series GPUs, we’re unveiling NVIDIA Reflex, a revolutionary suite of GPU, G-SYNC display, and software technologies that measure and reduce system latency in competitive games (a.k.a. click-to-display latency). Reducing system latency is critical for competitive gamers, because it allows the PC and display to respond faster to a user’s mouse and keyboard inputs, enabling players to acquire enemies faster and take shots with greater precision.

            Some latency bs

        • 2 years ago
          Anonymous

          >Ideologically they live in 2015
          Good. They should ideologically live in 2005. Make a card that can push a ridiculous amount of pixels as fast as possible, and nothing more.

      • 2 years ago
        Anonymous

        FSR 2.1 isn't too terrible, it's not on top but it's not like it is leagues apart from DLSS 2

        The real question is if chiplets are a thing this generation and if they get a new video engine and fix ray tracing performance next generation. Since the nodes are the same, it will be a good time to see how behind AMD is to Nvidia. The fact that they only have raster performance locked down and the drivers are still lower quality than Nvidia except on Linux doesn't bode that well.

      • 2 years ago
        Anonymous

        >AMD needs a DLSS alternative. Some AI softwares too
        I think AMD has gained good ground with FSR. I think they should carry on down the algorithm route, just like they did with FreeSync.
        It's free for everyone to use and is far easier to implement.
        Nvidia does amazing work, but they do it to flex. They seem to do things the hard way rather than the easy way.

  7. 2 years ago
    Anonymous

    Fricken grandma's been on the roids

  8. 2 years ago
    Anonymous

    Why would you waste money on an inferior piece of technology?

  9. 2 years ago
    Anonymous

    AMD doesn't give a shit about their high-end GPUs. They make bank on CPUs and console GPUs. High-end GPUs are a waste of silicon to them. They'll allocate little supply and set high prices.

  10. 2 years ago
    Anonymous

    I have a 6900XT and won't upgrade this gen.
    RT performance is fine imo, the only games I use it with are Quake 2 and Doom Eternal though, soon Portal and maybe Morrowind, we'll see how that RTX mod support turns out on linux.

    Intel is pretty close to Nvidia in relative rt performance, RDNA should also catch up quite a bit.
    DLSS is a bit nicer than both XeSS and FSR2, but it's hardly game-changing and you see now how the proprietary nature means it'll turn worthless on a whim when Nvidia wants to market a new gen.

  11. 2 years ago
    Anonymous

    i believe in amd supremacy

  12. 2 years ago
    Anonymous

    The final solution to the nvidia problem

    • 2 years ago
      Anonymous

      Big oof, how did amd not catch this?

      • 2 years ago
        Anonymous

        shut up israelite. It's intended.

        • 2 years ago
          Anonymous

          Imagine being antisemitic in 2022... I'm so sorry for you.

          • 2 years ago
            Anonymous

            dabbin those spinning R's on u

          • 2 years ago
            Anonymous

            Press R to disagree
            AMD literally dunking on nJudea

          • 2 years ago
            Anonymous

            >Imagine being antisemitic in 2022
            Imagine being so sheltered and ignorant that you're not!

      • 2 years ago
        Anonymous

        Wild guess, but isn't it rather because it's being recorded on a phone?

        • 2 years ago
          Anonymous

          it's when you take a video and the frame rate happens to match the rotation in a certain way (a quarter turn between every frame or something). Similar to those videos of a helicopter where the rotor doesn't appear to turn while it flies. You can't see that live; in person it would just look blurry.
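
          Rough numbers, just to show the aliasing (fan speed and frame rate here are made up for the example):

            # Toy illustration of the stroboscopic effect described above.
            fps = 30       # camera frame rate
            rpm = 1800     # assumed fan speed: 1800 RPM = 30 revolutions per second
            blades = 9     # assumed blade count

            revs_per_frame = (rpm / 60) / fps         # 1.0 -> exactly one turn per frame
            deg_per_frame = revs_per_frame * 360      # 360 degrees of rotation per frame
            blade_spacing = 360 / blades              # 40 degrees between blades

            print(f"{deg_per_frame:.0f} deg/frame vs {blade_spacing:.0f} deg blade spacing")
            # Whenever deg_per_frame is a multiple of the blade spacing, every frame
            # catches a blade where another one was, so the fan looks frozen (or slowly
            # reversing when it's slightly off), same as helicopter rotor videos.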

      • 2 years ago
        Anonymous

        >Big oof
        go back kiddo

    • 2 years ago
      Anonymous

      damn, if the graphics card was a little longer and took up the entire case it would look very aesthetic.

    • 2 years ago
      Anonymous

      wtf I love AMD now

  13. 2 years ago
    Anonymous

    No price will compensate for the lack of CUDA

    • 2 years ago
      Anonymous

      Sadly this. My 970 still plays games fine but what I really want is more waifu iterations per second.

  14. 2 years ago
    Anonymous

    >implying the cheapest amd gpu wont be $490
    deluded

  15. 2 years ago
    Anonymous

    4090 $ 1,999
    4080 $ 1,699
    4070 $ 1,199

    • 2 years ago
      Anonymous

      Australian dollars

      • 2 years ago
        Anonymous

        Australian prices have not been released.
        Expect 1.5x or even 2x increase.

        • 2 years ago
          Anonymous

          Yes they have.

    • 2 years ago
      Anonymous

      They are rebranding themselves as a luxury brand.

    • 2 years ago
      Anonymous

      >starting
      Nvidia lost it, even the 4090 is cut down as frick

      • 2 years ago
        Anonymous

        >not a single 48GB vram card
        Gay as frick

        • 2 years ago
          Anonymous

          Even 32 GB would have been great for the top end. Maybe in 2 more years™.

          • 2 years ago
            Anonymous

            >Even 32 GB would have been great for the top end. Maybe in 2 more years™.
            Man I hope rdna 4 has more vram
            16GB on my 6900XT isn't gonna be enough

          • 2 years ago
            Anonymous

            leaks show that 7900XT has 24GB vram

          • 2 years ago
            Anonymous

            >32 GB Vram
            FOR WHAT!??
            Do you have any idea how fricking meaningless VRAM is for gaming once you reach a certain threshold? The only spec that tells you jack shit about raw performance is TFLOPS, which Nvidia advertised poorly, so brainlets see 16GB > 12GB and think we need more

          • 2 years ago
            Anonymous

            >for gaming
            No, thanks, I have a job. 32GB is for compute.

          • 2 years ago
            Anonymous

            What games? There isn't a single game worth playing that you can't on a 10x0. The new cards are for AI.

          • 2 years ago
            Anonymous

            >3080Ti consistently higher 1% lows
            top kek, is this a general phenomenon or did this guy do something weird with his setup

  16. 2 years ago
    Anonymous

    >same rasterization performance but minimal gimmicks (and open source anyway)
    >excellent Linux drivers baked right into the kernel
    >cards that aren't overvolted out the wazoo
    >more sanely priced
    Yeah, I'm thinking AMD's back.

    • 2 years ago
      Anonymous

      Freetards are a different breed, I swear.

      • 2 years ago
        Anonymous

        >DLSS now locked out to anyone but the people who will take out a loan to buy the latest housefire cards because Nvidia knows they have nothing else going for them
        Absolutely no technical reason as to why it shouldn't work on at least 3000 series but hey, Nvidia knows that they need to pull Intel israelite tricks if you want to be seen as the "best" by morons regardless if your product catches fire or not.

        • 2 years ago
          Anonymous

          >DLSS now locked out to anyone but the people who will take out a loan to buy the latest housefire cards because Nvidia knows they have nothing else going for them
          Does Nvidia train their deep learning on every available DLSS version for compatible games, or only on the most recent one? In other words, can you also play a DLSS 3 compatible game with DLSS 2, or only with DLSS 3?

          • 2 years ago
            Anonymous

            Nobody knows because DLSS 3 doesn't exist yet

          • 2 years ago
            Anonymous

            >DLSS now locked out to anyone but the people who will take out a loan to buy the latest housefire cards because Nvidia knows they have nothing else going for them
            Absolutely no technical reason as to why it shouldn't work on at least 3000 series but hey, Nvidia knows that they need to pull Intel israelite tricks if you want to be seen as the "best" by morons regardless if your product catches fire or not.

            DLSS 3 is frame interpolation, not upscaling. It isn't even the same technology as DLSS 2.

          • 2 years ago
            Anonymous

            Yeah but if a developer pays to implement it, can't you train your thing for both technologies simultaneously?

          • 2 years ago
            Anonymous

            >they combined optical flow with the tensor cores for bullshit motion interpolation and this somehow justifies overpriced cards
            BRAVO NVIDIA

          • 2 years ago
            Anonymous

            NVIDIA defines the GPU market. They can do whatever they want. If you want that to change, stop buying their fricking products.

          • 2 years ago
            Anonymous

            How do I do that when I'm Stably Diffusing waifus?
            >inb4 ROCM
            Maybe they should get it working properly then.

          • 2 years ago
            Anonymous

            This. Between stable diffusion and the upcoming RTX Remix porn mods, Nvidia has the coomer niche cornered.
            AMD has to step up their game.

          • 2 years ago
            Anonymous

            what doesn't work properly?

          • 2 years ago
            Anonymous

            >Maybe they should get it working properly then
            It works properly. Stop spreading shill talking points.

          • 2 years ago
            Anonymous

            The best part is that the 4080 12GB might just barely match the 6900 XT 16GB right now, which has more VRAM and is currently being sold at $700.

          • 2 years ago
            Anonymous

            It's both, it's just DLSS 2 with frame interpolation added after upscaling.
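
            To be clear about what "added after upscaling" means: the generated frame is an in-between image computed from two rendered ones. A toy sketch with a plain linear blend; the real thing is guided by optical flow and motion vectors, so take the function below as illustration only:

              import numpy as np

              def naive_interpolate(frame_a, frame_b, t=0.5):
                  # Toy in-between frame: a straight blend of two rendered frames.
                  # DLSS 3 instead warps pixels along motion vectors / optical flow;
                  # this only shows where the extra frame slots in.
                  blended = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
                  return blended.astype(frame_a.dtype)

              # Two fake 1080p RGB frames; the "generated" one goes between them.
              a = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
              b = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
              mid = naive_interpolate(a, b)
              print(mid.shape, mid.dtype)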

          • 2 years ago
            Anonymous

            TAA still there? Into the trash it goes.

    • 2 years ago
      Anonymous

      >more sanely priced
      I'll believe it when I see it.

  17. 2 years ago
    Anonymous

    >can't run AI

  18. 2 years ago
    Anonymous

    The equivalent product will be $50 cheaper best case scenario, while having worse ray tracing performance, and FSR is still shit.

    what a bargain, mama su saves us yet again

    • 2 years ago
      Anonymous

      b-but muh FOSS drivers

    • 2 years ago
      Anonymous

      Correct. AMD has shareholders to answer to.

  19. 2 years ago
    Anonymous

    Still have my never obsolete GTX 1080.

    • 2 years ago
      Anonymous

      Same here, with a 6700K. I replaced DLSS with FSR 2.1 in Cyberpunk 2077, and now I know why Nvidia didn't allow DLSS on the GTX 1000 series.

  20. 2 years ago
    Anonymous

    >tfw not a ramen-eating wagie and can afford Nvidia

    Feels so good, bros.

    • 2 years ago
      Anonymous

      I once bought a motorcycle on craigslist for less than the cost of a 4090. I am not willing to pay more for a GPU than I did a vehicle I rode for 5 years.

  21. 2 years ago
    Anonymous

    when is AMD's event??????

  22. 2 years ago
    Anonymous

    Rumor has it RDNA3 goes to 4 GHz

  23. 2 years ago
    Anonymous

    The reality is AMD GPUs have a long way to go before being able to match Nvidia in features, when you buy an Nvidia GPU you will pay the Jensen Tax™ but it's undeniable that you can do a lot more with it.
    Sure if you are only interested in games and pushing more pixels without any trickery and graphical gimmicks AMD is fine but for everything else they are behind.

    FSR needs more work to compete with DLSS, AMD's media engine is not as good, especially their encoders, and when you factor in AI, SDKs, libraries and CUDA it becomes clear that they are far apart. Hell, leather jacket man spent 90% of the presentation talking about self-driving cars, simulation systems, modelling systems, development frameworks, a full Nvidia software stack. Does AMD have an answer for any of that? It's easy to see why Nvidia GPUs are gobbled up by enterprise and academia, while AMD ones are basically nonexistent.

    • 2 years ago
      Anonymous

      >FSR needs more work to compete with DLSS
      Who cares? You don't buy a $2000 GPU to upscale games.
      >AMD media engine is not as good
      It's fine. Not quite as good, but the difference doesn't matter unless you're a professional streamer. For offline transcoding or professional work you would be laughed out of the room for suggesting GPU encoding.
      >and when you factor in AI, SDKs, libraries, CUDA
      Irrelevant for consumer cards.

      • 2 years ago
        Anonymous

        High-concentration cope.

        • 2 years ago
          Anonymous

          High-concentration truth. Facts don't care about your feelings.

      • 2 years ago
        Anonymous

        High-concentration truth. Facts don't care about your feelings.

        >Irrelevant for consumer cards
        Then why are the best consumer LCZero and Stable Diffusion setups based on CUDA?

        • 2 years ago
          Anonymous

          You don't buy a GPU to generate anime breasts for a week before you get bored.

          Yes. Nvidia is for the PRO(tm) experience. The "A" in AMD is for Amateur.

          But you don't even have a job, you stinky NEET.

          • 2 years ago
            Anonymous

            Trying to insult me doesn't change reality. I don't understand why you deny the obvious truth. Seriously

          • 2 years ago
            Anonymous

            I didn't deny anything. You don't need the PRO(tm) features because you don't have a job.

            If your understanding of DLSS is that it just 'upscales games', you might be more moronic than you initially appeared.

            I know, as of today it also does that frame interpolation thing that shitty TVs do.

            What the frick is a VMAF, a AMF, a NVENC or a QSV?

            VMAF: https://github.com/Netflix/vmaf
            AMF: AMD Advanced Media Framework
            NVENC: NVIDIA Encoder
            QSV: Intel Quick Sync Video

          • 2 years ago
            Anonymous

            Thanks.

      • 2 years ago
        Anonymous

        Yes. Nvidia is for the PRO(tm) experience. The "A" in AMD is for Amateur.

      • 2 years ago
        Anonymous

        If your understanding of DLSS is that it just 'upscales games', you might be more moronic than you initially appeared.

      • 2 years ago
        Anonymous

        What the frick is a VMAF, a AMF, a NVENC or a QSV?

      • 2 years ago
        Anonymous

        You don't buy a gpu to upscale games at all.
        People that use DLSS are the same people that use FXAA and say it's the best thing since bread, being so clouded that they don't even see the smear it makes when anything moves or the extreme lack of detail and shimmering that is introduced.

        • 2 years ago
          Anonymous

          >People that use DLSS are the same people that use FXAA and say it's the best thing since bread
          People that play one game with a poor TAA implementation and play at 1080p and think that this is a blanket situation for every game with TAA are even more stupid.
          >makes when anything moves or the extreme lack of detail and shimmering that is introduced.
          Not a problem if you aren't playing at 1080p. DLSS has nothing to do with TAA either. In many games, running DLSS with no sharpening or TAA is preferable to many engines' shitty default TAA implementation. Learn the difference between the two technologies before you talk shit about other people.

          • 2 years ago
            Anonymous

            TAA can't be implemented well; it's shit by design, frame-averaging smoothing. Worse than motion blur.
            If you don't see how blurry TAA makes things you need to get glasses, I'm serious. Even at 4K it destroys all sharpness.

            >Learn the difference between the two technologies before you talk shit about other people.
            Your reading comprehension is as good as your taste in anti-aliasing technologies.

          • 2 years ago
            Anonymous

            >DLSS has nothing to do with TAA as well.
            It is literally TAAU but using a neural network model to control the upsampling as opposed to a simple filter.
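
            In other words, both keep a history buffer and blend each new jittered, lower-resolution frame into it; DLSS just has a network decide how much of that history to trust per pixel instead of a fixed weight. A stripped-down sketch of the accumulation step (single channel, no reprojection, constants picked arbitrarily):

              import numpy as np

              def temporal_accumulate(history, new_frame, alpha=0.1):
                  # One TAA-style step: exponential blend of the new frame into history.
                  # Real TAA/TAAU/DLSS also reprojects the history along motion vectors
                  # and clamps it against the new frame's neighbourhood; loosely, the
                  # network replaces this fixed alpha with a learned per-pixel weight.
                  return (1.0 - alpha) * history + alpha * new_frame

              rng = np.random.default_rng(0)
              history = rng.random((1080, 1920), dtype=np.float32)   # stand-in frames
              for _ in range(8):
                  history = temporal_accumulate(history, rng.random((1080, 1920), dtype=np.float32))
              print(history.mean())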

          • 2 years ago
            Anonymous

            >TAA
            The frick is TAA?
            Anti-aliasing at 4K and higher is simply not required.
            The problem here is that no upscale gives a pixel-perfect, identical image, and what's worse, journos are shifting away from caring about it. "Look how well it feeds us 1080p sources, we don't even notice" is the new tech meme. People stopped caring about raw perf and a crisp image; what's even the point of 8K then?

      • 2 years ago
        Anonymous

        >professional streamer
        Name one big streamer that cares about encoding.

        • 2 years ago
          Anonymous

          Linus Sebastian.

          • 2 years ago
            Anonymous

            He's not even a small twitch streamer. Big youtuber shill, but his twitch streams barely pass 300 people on average.

      • 2 years ago
        Anonymous

        >Irrelevant for consumer cards.
        Which is why these features are present in said consumer cards and why a lot of professionals and companies buy them. Huh.

  24. 2 years ago
    Anonymous

    What the frick happened to the days of 300 or so for low, 400-500 for mid and 500-600 for high end cards? I think I paid around 500 or so for my RTX 2070 in 2019(?) and I'm content with keeping that until prices return to what I would consider normal (if they ever do).

    • 2 years ago
      Anonymous

      Increasing production costs with node shrinks, inflation and greed.

      Mainly greed.

    • 2 years ago
      Anonymous

      do you have any idea how much inflation the western countries have amassed these last couple of years? do you have any idea how little our money is actually worth right now? our dear leaders work their asses out of their pants trying to get the inflation "under control" but it's not really working they way they want it to. this is just the beginning, wait a couple of more years and the great reset will begin in earnest. when you all go nutzo and kill all your politicians, don't forget to also kill all the israelites behind them. especially the rothschild israelites.

    • 2 years ago
      Anonymous

      Yeah, remember how much you could've bought for a dollar in the 1930s? Those times are long gone and will never return. Inflation never reverses, and if it does, it's even worse.

  25. 2 years ago
    Anonymous

    >400 series
    >a massive fire hazard
    >4000 series
    >a gigantic fire hazard
    I'm starting to see a pattern here with the number 4.

    • 2 years ago
      Anonymous

      >I'm starting to see a pattern here with the number 4.

      >Asians leave out the number 4 out of superstition due to it sounding like death

    • 2 years ago
      Anonymous

      GeForce 4 series was kino, though.

    • 2 years ago
      Anonymous

      Novidya will always be housefire garbage in real world performance.

      • 2 years ago
        Anonymous

        >VRAMlet
        Into the trash IQfyermin

  26. 2 years ago
    Anonymous

    In Austria they capped electricity to 10ct/kWh. Heating with a 4090 is cheaper than burning gas.
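
    Back-of-the-envelope, taking the 10 ct/kWh cap at face value and a card pulling roughly 450 W flat out; the gas figure is an assumption you'd swap for your own tariff:

      # Rough heating-cost comparison for the claim above. Electricity price is the
      # quoted Austrian cap; the gas price per kWh of heat is an assumed placeholder.
      card_watts = 450                  # roughly a 4090 under full load
      hours = 24
      elec_price = 0.10                 # EUR per kWh, capped
      gas_price_per_kwh_heat = 0.15     # EUR per kWh of delivered heat (assumption)

      kwh_per_day = card_watts / 1000 * hours          # 10.8 kWh of electricity
      cost_gpu = kwh_per_day * elec_price              # resistive heat is ~100% efficient
      cost_gas = kwh_per_day * gas_price_per_kwh_heat  # same heat delivered by gas

      print(f"GPU heat: {cost_gpu:.2f} EUR/day vs gas: {cost_gas:.2f} EUR/day")
      # The GPU only wins while gas costs more per kWh of heat than the capped electricity.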

  27. 2 years ago
    Anonymous

    Memes aside, how are the AMD drivers at this moment in time?
    I've been reading that they continually improve, but how do they compare to Nvidia nowadays when it comes to both AAA and indie games?

    • 2 years ago
      Anonymous

      >Memes aside how are the AMD drivers at this moment of time?
      On windows?
      >functional, but video encoding performance is worse and the lack of day-1 game drivers is a gamble
      on Linux?
      >Quite good, baked right into the kernel so it should just work. Tends to run cooler and be more performant than on Windows.

      • 2 years ago
        Anonymous

        >Lack of day 1 game ready drivers
        AMD fixed this issue in 2019 or earlier.
        >5500 XT and 6800 XT user

  28. 2 years ago
    Anonymous

    this is a terrible situation; AMD can make a card 10 to 20 cheaper compared to Nvidia and it will still be at a moronic price.

  29. 2 years ago
    Anonymous

    The redemption is coming.

  30. 2 years ago
    Anonymous

    Five hundred and ninety nine US dollars

  31. 2 years ago
    Anonymous

    It will be more like

    >We will adjust our prices according to yours
    >Oh nononono, we were going to charge only $1000 for the RX 7950
    >Now it is just $1500, $100 cheaper than yours.

  32. 2 years ago
    Anonymous

    AMD really needs to do something about CUDA. They need something open source to replace the CUDA black box NVIDIA is using.

    • 2 years ago
      Anonymous

      >They need something open source to replace the CUDA black box NVIDIA is using.
      They already have that, it's just that it isn't as good as the original thing.

      Either AMD prices their new GPU line very aggressively, or it's still a no-brainer to get an Nvidia GPU. Their drivers also suck, so yeah, if they don't even have good pricing, AMD GPUs will have nothing going for them.

      • 2 years ago
        Anonymous

        >it's just that it isnt as good as the original thing.
        And never will be. On top of their poor support, slow driver releases, and lack of libraries, they've made it clear that RDNA and any consumer cards are secondary to their CDNA enterprise cards.

    • 2 years ago
      Anonymous

      CUDA is already deeply entrenched in all professional ML

      It's over. You'd have more luck getting corporate off windows

  33. 2 years ago
    Anonymous

    Where is your CUDA equivalent again?

    • 2 years ago
      Anonymous

      https://github.com/RadeonOpenCompute

      • 2 years ago
        Anonymous

        Can I run PyTorch on it on Windows?

        • 2 years ago
          Anonymous

          Officially only on Linux atm, but I think you might be able to get it running on Windows with WSL.

          • 2 years ago
            Anonymous

            >WSL
            So only on Linux

            Though to be fair, the ROCm source has many indications that Windows support is in active development.

        • 2 years ago
          Anonymous

          Dunno. But Windows is a toy OS.

      • 2 years ago
        Anonymous

        >doesn't work on the latest Ubuntu LTS
        t-thanks, AMD

    • 2 years ago
      Anonymous

      >Where is your CUDA equivalent again?
      All the software I use is OpenCL optimized.

  34. 2 years ago
    Anonymous

    They'll undercut Nvidia by 10% and leave any possible market share gain off the table.
    It's been like this since forever; if you think otherwise you're simply deluded.

  35. 2 years ago
    Anonymous

    Is it time for Massive Noavi?

  36. 2 years ago
    Anonymous

    she should really have that growth on her left cheek removed, it looks malignant

    • 2 years ago
      Anonymous

      That's a headset microphone.

      • 2 years ago
        Anonymous

        y..you're a headset microphone

        • 2 years ago
          Anonymous

          For you.

  37. 2 years ago
    Anonymous

    gpus are getting bigger and bigger. what the frick is this shit. i dont want to buy a cupboard and a powerplant to run a 40xx card

    • 2 years ago
      Anonymous

      >i dont want to buy a cupboard and a powerplant to run a 40xx card
      then you simply don't want top of the line performance, which is ok

  38. 2 years ago
    Anonymous

    RDNA3 is going to be a disappointment for desktop, what I'm looking forward to is the mobile sector. We're about to get some real potent handheld and laptop APUs with almost 1060-tier performance using only 15W.

    • 2 years ago
      Anonymous

      Yep, this is where iGPUs begin their path of demand destruction while discrete GPUs move upmarket.

  39. 2 years ago
    Anonymous

    AMD will have the same price increase as Nvidia did because of TSMC, stop being poor

    • 2 years ago
      Anonymous

      no it won't. AMD is using N5, not N4, and they're using half of what NVIDIA is using, plus N6 for the cache, so they can charge less and still make high margins.
      and yes, NVIDIA is just gouging.

  40. 2 years ago
    Anonymous

    Save us GPU chinkmother.

  41. 2 years ago
    Anonymous

    Can someone validate the claim in this video? Also, this was published in January, and I believe AMD had an update in August that boosts performance for machine learning using DirectML. So, is AMD a viable option for ML now?

    • 2 years ago
      Anonymous

      No, AMD is in no way close to being viable for GPU compute. They hardcode their Python dependency for ROCm to 3.8, while at the same time not shipping Python with ROCm.
      Their level of incompetence is astonishing; this is something a beginner could fix in a day.

      • 2 years ago
        Anonymous

        >They hardcore their Python dependency for ROCm to 3.8, while at the same time not shipping Python with ROCm.
        In 2022, you download and unpack the required Python version yourself via Ansible/Puppet/Salt, put it in /usr/local/bin and lib, and run update-alternatives. Or use a venv. Or use package manager packages if they're versioned. Why ship something that will for sure be vulnerable, when people install updated packages by themselves all the time?
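
        A minimal sketch of the venv route, assuming you've already dropped a 3.8 build somewhere like /usr/local (the paths below are examples, not anything ROCm ships):

          # Pin a separate Python 3.8 just for the ROCm tooling without touching the
          # distro's Python. Interpreter and env paths are examples.
          import subprocess

          PY38 = "/usr/local/bin/python3.8"   # wherever you unpacked 3.8
          ENV = "/opt/rocm-py38"

          subprocess.run([PY38, "-m", "venv", ENV], check=True)
          subprocess.run([f"{ENV}/bin/pip", "install", "--upgrade", "pip"], check=True)
          # Install whatever ROCm Python bits you need into this env the same way.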

        • 2 years ago
          Anonymous

          ROCm supports "Ubuntu LTS" (what exactly this means is not mentioned), and the latest LTS comes with Python 3.10. Due to the hardcoded dependency on 3.8 they can't support the latest "Ubuntu LTS". The code itself has no issues running on 3.10; it's a packaging bug they haven't fixed in almost a year.

          Let that sink in – the company you expect to "enable AI" and give you tools and drivers for GPU compute can't figure out how to change a stupid dependency... IN A YEAR.
          >inb4 Jammy has only been out for 6 months
          https://github.com/RadeonOpenCompute/ROCm/issues/1612

          • 2 years ago
            Anonymous

            Probably 18.04. VMware Greenplum supports 18.04, every corporate thing like Citrix supports 18.04 (hope they do, they were on 16.04 a year ago). If it's 20.04, that's a step up for sure already.
            > Let that sink in
            Dude, I swim in that shit, don't even start.

  42. 2 years ago
    Anonymous

    Thats a man

    • 2 years ago
      Anonymous

      MA'AM
      IT'S MA'AM!
      *hits you with a bat*

  43. 2 years ago
    Anonymous

    PC hardware is becoming a status symbol, like owning the latest iPhone.

    People will just buy the 4090 to say "look, I got the 4090", then play at 1080p 60 FPS.

    • 2 years ago
      Anonymous

      for a tiny subset of rich people. most of them will still get nice cars and watches like they used to.
      the 40 series will be an absolute disaster. miners don't want them (the perf per dollar and watt are abysmal, mining is likely dead anyway after ETH finally walked the walk) and scalpers have already taken too many losses on the 30 series to try again so soon. your favorite tech youtuber and streamer will get one, but absolutely everyone else will skip this and maybe get a used 3090 instead for a quarter of the price and 80% of the performance.

  44. 2 years ago
    Anonymous

    It's likely AMD would have considered launching the 7900XT at 1500 if Nvidia had chosen 2000 for the 4090, but they went more aggressive.
    There's also no chance the 4080 price sticks, it's ridiculous, once 3080 stock clears they'll have to release a Super series to rectify prices.

  45. 2 years ago
    Anonymous

    Still buying a 4090. Still running textual inversions for SD. Sorry seething neets

    • 2 years ago
      Anonymous

      the 4090 is the only one you should buy if you have the need, both 4080 skus are just jensen pissing in your face

      • 2 years ago
        Anonymous

        Could be. I'm not poor so i just need VRAM for the future of art

  46. 2 years ago
    Anonymous

    You homosexuals still think AMD is going to save you after what happened with Zen 3.

    • 2 years ago
      Anonymous

      RDNA3 is the Zen2 of Radeon

    • 2 years ago
      Anonymous

      It's in their own interest to gain back market share; they have a chance right now to decimate the 4080 in price/performance, which is arguably more important than the crown.

  47. 2 years ago
    Anonymous

    Is amd releasing new cards soon? if so when?

    • 2 years ago
      Anonymous

      November 3rd

      • 2 years ago
        Anonymous

        frick

  48. 2 years ago
    Anonymous

    nvidia on windows, radeon on linux
    drivers is the most important thing

  49. 2 years ago
    Anonymous

    >here's your shitty drivers anon
    No thanks, even if RDNA3 was 30% faster than 4090, I still wouldn't buy it because I don't want to deal with AMD headaches.

    • 2 years ago
      Anonymous

      >le bad drivers
      why do redditors parrot this

      • 2 years ago
        Anonymous

        I had an AMD card for 4 years, the mediocre driver shit is true. Their drivers simply aren't as good, their driver team isn't as big as Nvidia's. Why do simple facts make you people seethe so much? Who cares?

    • 2 years ago
      Anonymous

      If you think AMD has bad drivers then you haven't seen nvidias shit yet

  50. 2 years ago
    Anonymous

    I had an Nvidia card for 4 years, the mediocre driver shit is true. Their drivers simply aren't as good, their driver team isn't as big as AMD's. Why do simple facts make you people seethe so much? Who cares?

    • 2 years ago
      Anonymous

      I ran Nvidia cards from 2015-2019 (970 and 2060) and had several instances of driver issues, mainly blue screens and broken Freesync. I've run AMD since 2020 (5500 XT and 6800 XT) and my only issue is that AMD Software will sometimes close randomly when adjusting settings. So both have issues, less severe on the AMD side.

      • 2 years ago
        Anonymous

        >Freesync
        oh I see you're just lying

      • 2 years ago
        Anonymous

        Same. The "Nvidia is more reliable" meme is just Nvidia marketing; both my laptop Nvidia card and my 2070 have had shitty problems with bluescreens and some problems with suspend.

  51. 2 years ago
    Anonymous

    I had an Nvidia card for 4 years, the mediocre driver shit is true. Their drivers simply aren't as good, their driver team isn't as big as AMD's. Why do simple facts make you people seethe so much? Who cares?

  52. 2 years ago
    Anonymous

    Maybe I'll just stick to integrated graphics, buy an Xbox, and get one of those keyboard/mouse adaptors. GPU pricing has gone completely insane over the last 6 years.

    • 2 years ago
      Anonymous

      yeah, I was kinda thinking along those lines a few months ago, but then again I play way too much Paradox shit like HoI4 and EU4 for a console to make a lot of sense. In the end I went for an RTX 3070 Ti system and hope for the best. Now they say my PC should arrive Friday next week, so it seems like the waiting never ends.

    • 2 years ago
      Anonymous

      RDNA 2 in every Zen 4 Ryzen has me curious.

      Get an NH-P1 and NVMe drives in RAID 0, and you'll have a pretty insane workstation.

  53. 2 years ago
    Anonymous

    AyyMD will raise their prices as well just because they can, duopoly sux.

  54. 2 years ago
    Anonymous

    No fast video encoding, no DLSS, into the trash it goes. I'm a proud oculus link cuck.

    • 2 years ago
      Anonymous

      FSR 2.0 is pretty great.

      • 2 years ago
        Anonymous

        and it runs on everything so it doesn't leave GTX 980ti/1000 series chads to rot

      • 2 years ago
        Anonymous

        now post it in motion with all the dogshit ghosting

  55. 2 years ago
    Anonymous

    lol, you think they can pump out 100 times more GPUs this time, and for a "fair" price?
    The prices will be as "fair" as the new X670E boards. Open your butthole for the 7900XT starting at $1299.

    • 2 years ago
      Anonymous

      Kinda glad I only spent 1k on my 6900XT.
      If RDNA 3 has stupid pricing I'll wait for RDNA 4, as it will have to compete with a more shrewd Nvidia and Intel.

  56. 2 years ago
    Anonymous

    i just want a PURE RASTER CARD FOR FRICK'S SAKE! i want an x70-class card for $350-400 like the good old days. even shintel is doing meme tracing shit, thus can't have low prices.

    • 2 years ago
      Anonymous

      So go get a 6800xt?

      Sadly this. My 970 still plays games fine but what I really want is more waifu iterations per second.

      Apparently 4090 can do sd 512x per second

    • 2 years ago
      Anonymous

      7700XT $500

      • 2 years ago
        Anonymous

        some Drones unironically writing this after the AM5 prices. LMAO

        • 2 years ago
          Anonymous

          The AM5 7XXX desktop CPU prices are mostly reasonable, except at the lower-end. Also, how AMD decides to price RDNA3 will depend on how aggressively they want to try and take market share from Nvidia. If they believe they can sell through everything they make at Nvidia-equivalent card -$50 prices, AMD might decide to do that. But if they are focused on stealing GPU marketshare this generation, I wouldn't be surprised to see somewhat aggressive pricing to make the 4080 and the "4080" look worse.

          • 2 years ago
            Anonymous

            for aggressive market share accumulation you need stock. Looking at every GPU stat that exists shows that AMD can't even pump out 1/20 of the stock that Nvidia has. They would need to pump out millions of GPUs more this time around.

            I'm not holding my breath

  57. 2 years ago
    Anonymous

    I don't think they'll compete with the 4090; they never make the best GPU in any given cycle.

    Maybe they'll be competitive with Nvidia's last gen lol

  58. 2 years ago
    Anonymous

    What's everyone's favorite pejorative for Nvidia? I'm between Njudea and Shitvidia

    • 2 years ago
      Anonymous

      ? They are the White People of GPUs, there's no slur. Meanwhile AMD are the pajeets

    • 2 years ago
      Not paid by rockstar

      Novideo

    • 2 years ago
      Anonymous

      Their fan bois are Nvidiots

  59. 2 years ago
    Anonymous

    In order for AMD to win
    >meet Nvidia performance in trad rasterization
    >keep improving FSR
    >keep improving their hardware encoding
    >fix their fricking windows drivers and finally pay a team to do day-1 game driver updates.
    You can sidestep the driver issue if you're running Linux. That leaves it to them just staying the course and focusing on pure performance and efficiency without wasting time on motion interpolation gimmicks or other consoomer shit.

  60. 2 years ago
    Anonymous

    The AMD 7000 GPUs are going to be on the 5nm process, right? And isn't Nvidia on 4nm, which is more expensive?
    And RDNA 3 should have at least a few models that are chiplet based, which can help a lot with costs as well.

    I'm not saying AMD will be the good guy and pass real, major savings onto the customer, but there is a shred of possibility, right? I'm still not getting my hopes up because the GPU market has been absolute cancer for 6 years.

    • 2 years ago
      Anonymous

      A 7900XT at $999 with roughly 75% of the 4090's performance is all they need.

      • 2 years ago
        Anonymous

        Why do you think it'll only be 75%? The 6900XT matched the 3090 in a lot of things.

        • 2 years ago
          Anonymous

          720p gaming doesn't count, Chang.

          • 2 years ago
            Anonymous

            Here's your 4k benchmark bro.

          • 2 years ago
            Anonymous

            >"""4K"""
            >3840x1600
            >Low

          • 2 years ago
            Anonymous

            I got you bro. Notice that this only has up to the 6800XT

    • 2 years ago
      Anonymous

      >I'm not saying AMD will be the good guy and pass real, major savings onto the customer, but there is a shred of possibility, right?
      eh. A little bit
      >the GPU market has been absolute cancer for 6 years.
      You can thank the whole "le pc master race XD" meme marketing strategy. When PC gaming was just "pc gaming" and the ugly duckling compared to consoles, we actually had it good and didn't know better. It was actually worth it to play on PC because there was no status attached to it.

      • 2 years ago
        Anonymous

        I'm very aware of the PC Mustards and the damage they did. The R9 290 was my first GPU. $320 on Newegg after a $60 rebate. This talk about a mid tier GPU costing $500 or more is moronic.

    • 2 years ago
      Anonymous

      TSMC's 4N isn't 4nm but 5nm.

    • 2 years ago
      Anonymous

      RDNA3 is 5nm and 6nm, the cache is 6nm and the GCD is 5nm.
      7600XT is monolithic

  61. 2 years ago
    Anonymous

    Don't care, I'm buying an Arc A770

    • 2 years ago
      Anonymous

      I just wish PyTorch supported oneAPI (Intel's equivalent to CUDA) so I could run Stable Diffusion on it.

      • 2 years ago
        Anonymous

        Just use the ONNX models
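
        Something like this, via onnxruntime's DirectML execution provider; the model path is a placeholder for whichever ONNX export of SD you're using:

          # Sketch of loading an ONNX model on an Arc/AMD card through DirectML.
          # Needs the onnxruntime-directml package; the path below is a placeholder.
          import onnxruntime as ort

          sess = ort.InferenceSession(
              "unet/model.onnx",   # placeholder for your exported SD component
              providers=["DmlExecutionProvider", "CPUExecutionProvider"],
          )
          # If DirectML actually kicked in it is listed first here; otherwise the
          # session silently fell back to CPU, which is the usual "why is it slow".
          print("Active providers:", sess.get_providers())
          print("Inputs:", [(i.name, i.shape) for i in sess.get_inputs()])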

  62. 2 years ago
    Anonymous

    The AMD prices will also be shit, won't they...
    It's over

    • 2 years ago
      Anonymous

      It's what we call "competitive".

      • 2 years ago
        Anonymous

        competitive to whom.? You think the average Joe has 1k laying around to spend on GPUs homosexual.

        • 2 years ago
          Anonymous

          You didn't get the joke, try now.
          >competitive
          As in they compete with each other at being shit.
          nV clearly doesn't care what AMD does; they play their own game, and AMD also won't just lower the price if they can sell high.

        • 2 years ago
          Anonymous

          To another competitor, duh. Ask the same question about a wage "compensation".

  63. 2 years ago
    Anonymous

    AMD doesn't do their own thing. They are hyper-aware of the current markets and will adjust their prices accordingly. They don't seem too interested in disrupting the markets. We need the Chinese for that. The GPU market has not seen its Android phones yet.

    • 2 years ago
      Anonymous

      > comparing silicon chips to assembled micro computers
      Maybe you mean "have not seen their Mediatek yet."

  64. 2 years ago
    Anonymous

    yeah, enjoy your 50 bucks discount. man, competition sure is grrrreat!

  65. 2 years ago
    Anonymous

    AMD really has us stuck between a rock and a hard place. I'm actually stressing over this a bit. Maybe I'll just buy an Intel CPU and use their decent integrated graphics instead.

  66. 2 years ago
    Anonymous

    honestly, without a RAM bump, I don't see the point. I have a 3080 and I have yet to really use DLSS. tried it in some games, but if I put it on the highest quality the gain is lackluster, and if I put it on lower quality - to actually get a proper boost - the game looks like shit.

  67. 2 years ago
    Anonymous

    >Want a GPU that won't empty my wallet
    >Also really want good ray tracing

    Used it is.

    • 2 years ago
      Anonymous

      RDNA 3 will handily beat Ampere in RT

      • 2 years ago
        Anonymous

        Not that Anon but I hope so, I'd love to get a slightly cut down top tier die and live the R9 290 dream I couldn't afford all those years back.

        • 2 years ago
          Anonymous

          >R9 290
          The only 1080p 60FPS card you ever needed for 7 years.

          • 2 years ago
            Anonymous

            GCN was so good

      • 2 years ago
        Anonymous

        RDNA3 beating Ampere in RT is all I really care about.
        Ampere has good ray tracing perf in blender. That's all I need.

        General question:
        I was looking to refresh my PC, but the 4000 series sounds like cancer and Intel's Raptor Lake platform sounds like a mixed bag.

        If I'm using a system with an i7 4790k and a GTX 980 currently, which is now obviously a little old and powerhungry, is a 12th gen Intel CPU and a 3070±Ti a good match to last me ~5 years of light gaming?
        (I just want something that can do 1440p and still be crammed into a small case for a HTPC/couch)

        5800X3D is a good upgrade. It's 2.5-5x faster than your current CPU in games. Massive upgrade and they've been on sale lately plus DDR4 is cheap.

        Since you lasted on a 4790k so long, I don't think getting last gen DDR4 platform will matter much for you. Sounds like you don't want to bother upgrading much or you'd already have upgraded to Ryzen when the 3000 series came.

        If you DO plan to upgrade more often, then DDR5 and Ryzen 7000 would be worth it since, as long as your board doesn't break, you can use the same board 5 years later.

        >3070/TI
        One of the worst offerings Nvidia has. The RX 6800 non-XT is a lot better.
        The 3070 is only like 8% better than the 3060Ti and it costs like 30% more. If you must get Nvidia, try and grab the $400 3060Ti from Bestbuy when they're in stock. That's a huge upgrade over your gtx 980 which is worse than the 6500xt.
        You could also wait for RX 7000 series. The 5800X3D should be powerful enough to power it at 1440p.

  68. 2 years ago
    Anonymous

    >one of the games I play doesn't recognize the recommended driver
    >another one of the games I play crashes with the optional driver
    Also what the frick is this

    • 2 years ago
      Anonymous

      Laptop or desktop?

  69. 2 years ago
    Anonymous

    They are related you know.

    • 2 years ago
      Anonymous

      None of the reddit threads ever mention this detail.
      I dont have an account or I would stir that shit up

      • 2 years ago
        Anonymous

        >I dont have an account or I would stir that shit up
        You'd get permabanned for it within a few days at most.

  70. 2 years ago
    Anonymous

    nvidia and amd are family

    they are not going to fight each other on prices

  71. 2 years ago
    Anonymous

    homosexuals will go out and get it, anyway. They will always go against their self interest, when the fact is, NOT buying this shit is in their self interest. Do severe damage to Nvidia so they will learn a lesson.

    • 2 years ago
      Anonymous

      Literally me anytime a new patch drops for a game

    • 2 years ago
      Anonymous

      Literally me anytime a new standard drops for C++

    • 2 years ago
      Anonymous

      Who pays for this?

      • 2 years ago
        Anonymous

        the same people who had a 3090ti and are now getting a 4090

  72. 2 years ago
    Anonymous

    General question:
    I was looking to refresh my PC, but the 4000 series sounds like cancer and Intel's Raptor Lake platform sounds like a mixed bag.

    If I'm using a system with an i7 4790k and a GTX 980 currently, which is now obviously a little old and powerhungry, is a 12th gen Intel CPU and a 3070±Ti a good match to last me ~5 years of light gaming?
    (I just want something that can do 1440p and still be crammed into a small case for a HTPC/couch)

    • 2 years ago
      Anonymous

      If you don't care about 4k or raytracing then it's more than enough.

      • 2 years ago
        Anonymous

        if you don't want power hungry, don't go current gen intel

        Get a 5800X3D. It's the best gaming CPU you can buy right now and should easily last you that long. The 3070 is alright, but the 6800 XT is around the same price and should be better than it in most areas. DLSS is very overrated. Upscaling in general is overrated.

        RDNA3 beating Ampere in RT is all I really care about.
        Ampere has good ray tracing perf in blender. That's all I need.

        [...]
        5800X3D is a good upgrade. It's 2.5-5x faster than your current CPU in games. Massive upgrade and they've been on sale lately plus DDR4 is cheap.

        Since you lasted on a 4790k so long, I don't think getting last gen DDR4 platform will matter much for you. Sounds like you don't want to bother upgrading much or you'd already have upgraded to Ryzen when the 3000 series came.

        If you DO plan to upgrade more often, then DDR5 and Ryzen 7000 would be worth it since, as long as your board doesn't break, you can use the same board 5 years later.

        >3070/TI
        One of the worst offerings Nvidia has. The RX 6800 non-XT is a lot better.
        The 3070 is only like 8% better than the 3060Ti and it costs like 30% more. If you must get Nvidia, try and grab the $400 3060Ti from Bestbuy when they're in stock. That's a huge upgrade over your gtx 980 which is worse than the 6500xt.
        You could also wait for RX 7000 series. The 5800X3D should be powerful enough to power it at 1440p.

        How about this?
        Sorry I know it's not the thread for it, but thanks for the pointers.

        • 2 years ago
          Anonymous

          Why the frick would you put a 5800X3D against a 3060 Ti? Pairing a high-end CPU with a low-to-mid range GPU is a complete waste of money. The CPU will not be bottlenecking you unless you are playing lots of computationally expensive strategy games.
          https://cpu.userbenchmark.com/Compare/AMD-Ryzen-7-5800X3D-vs-AMD-Ryzen-5-5600X/m1817839vs4084
          Save your money and upgrade your SSD instead. Don't be a 500GB cuck, get at least 1TB. Also, you're in Oz, so go to MWave; they are having an end-of-generation sale atm, you'll get better deals across your entire spend and it will be a single delivery.

    • 2 years ago
      Anonymous

      if you don't want power hungry, don't go current gen intel

    • 2 years ago
      Anonymous

      Get a 5800X3D. It's the best gaming CPU you can buy right now and should easily last you that long. The 3070 is alright, but the 6800 XT is around the same price and should be better than it in most areas. DLSS is very overrated. Upscaling in general is overrated.

  73. 2 years ago
    sage

    anyone else jerk it to Lisa Su? or just me?
