High end graphics cards are useless, prove me wrong


    • #103197
      Anonymous
      Guest

      Even if you *need* more, you don’t need *more*.
      Prove me wrong.

    • #103200
      Anonymous
      Guest

      you’re right.

    • #103203
      Anonymous
      Guest

      Try rendering a 5-minute animation in a scene with 50 assets, each with at least 4 textures, plus 10 light sources.

      • #103443
        Anonymous
        Guest

        Sounds gay. How about you stop making CGI junk?

      • #103485
        Anonymous
        Guest

        that’s realtime on a 10 year old computer, anon. you need to be more specific about what and how you render if you want to sound like a big boy.

      • #103564
        Anonymous
        Guest

        OH GOD I’M RENDOOOORING

    • #103223
      Anonymous
      Guest

      you just need to de-shroud it and zip tie some proper 90mm fans, because the included ones are a piece of shit

    • #103224
      Anonymous
      Guest

      Ti would make more sense for the price

    • #103225
      Anonymous
      Guest

      https://i.4cdn.org/g/1632941237060.png

      You don’t need more than a 1060 6GB for gaming. It will last through this decade until integrated graphics catch up.

      Nvidia made a mistake when they decided they were only going to cater to the $1000+ market.

      • #103272
        Anonymous
        Guest

        Lmao those are prebuilts. Not even mentioning how worthy of disdain the average Steam user plebbitor scrotebrain on there is, those are *literally* all the most common prebuilt SKUs:
        >1060
        >1650 super
        >1660
        >1660 super
        >2060
        >soon 3060 and 3060ti
        It’s literally just because the average Steam user is an absolute complete and utter scrotebrain and/or child who only buys prebuilts, and thus tries getting the cheapest prebuilt they can, which results in all the 60 cards because that’s like 95% of all prebuilt sales. Likely the only ones of those who even built and upgraded their own system are the 1070 owners.

        Also, just because it’s workable doesn’t mean it’s good. All those people now have to, or soon will have to, either turn down their settings or turn on FSR, and 1080p sucks. After using 1440p I can’t bear to even look at that shit anymore. Know how you feel about 768p or 900p after upgrading to 1080p? Yeah, that’s how I feel about 1080p now.

        You can get away with it on a 1080p display, but that’s also like saying the GTX 980 is fine, which gets similar performance to most of those cards listed (5500XT/1650 Super/1060 6GB). So yes, you can also get away with a GTX 970 or 980 or RX 580 in this day and age, because even most modern games list that as recommended specs, but I’d still rather have a 5700XT at 1440p, or an RTX 3080 or RX 6800XT at 4k, though I’m not paying scalped prices. You can also get away with a 3770k. That doesn’t mean I wouldn’t rather have a 3700x or 5600x or 5800x or something.

        • #103316
          Anonymous
          Guest

          So many words to justify buying a device for running modern AAA shitty games for scrotes and scrotes?

        • #103329
          Anonymous
          Guest

          >Lmao those are prebuilts.
          Lmao scrote, I’ve had a 1060 in all my builds since it released because it was the best price/performance option, and only upgraded CPU, mobo and storage.

          A 1060 can run any modern game no worse than a PS5, and it will last until integrated graphics reach its level. There is literally no way Nvidia can force 80% of their market – people who bought GPUs under $300 – to buy their premium marketing shit way over $500, let alone the $800 they charge for a 3060.

          They can raise their prices all they want; people will just sit on the 1060 and then move to integrated graphics or consoles. There are only so many idiots with money – about 5% across all of the 30 series combined – that are willing to pay what Nvidia is asking.

      • #103319
        Anonymous
        Guest

        Wrong as always. You don’t even need the RTX line. Should have stopped at GTX. RTX is a meme and is just marketing to sell overpriced trash. SHINY LIGHTS!!! BUY NEW CARD NOW!!!

        Someone gets it. Stop freaking buying memes. You’re literally just throwing away money. You’re not impressing anyone.

    • #103226
      Anonymous
      Guest

      i need more for my 360 hz monitor

    • #103241
      Anonymous
      Guest

      It’s literally inferior to a $400 card from over 2 years ago. If you think a 5700XT isn’t enough then that definitely isn’t enough, and it sure as shit isn’t goddamn worth it if you’re not paying MSRP. Seriously a freaking 5700XT beats it.

      • #103245
        Anonymous
        Guest

        >No CUDA
        >No OptiX
        >Only 8GB of VRAM
        >Buggy as shit drivers, nonexistent studio drivers
        >Only edges ahead of the 3060 in gooms that aren’t AMD-sponsored
        No.

        • #103311
          Anonymous
          Guest

          >Optix
          Oh so you do programming and modeling on Windows?
          >CUDA
          Oh so you do professional work? It’s like mentioning tensor cores. You know why nobody brings that up? Because most people aren’t using it for professional machine learning work on Windows, they’re using it for freaking gaming. None of that is relevant if you’re not using it for your job.
          >VRAM
          >from nvidia
          Ayyyy lmao nice sub-16gb you got there. Nice 8gb 3070 and 3060ti you got there. Nice nearly any nVidia vs AMD launch you got there.
          >bbut muh drivers!
          >that gen
          You know why nobody complains about pic related bugs anymore? Because it eventually got fixed, ages ago. My RDNA card doesn’t have any issues, it mines like a champ, and evidently still games fine at a 2021 tier. This is not even mentioning the fact that not only are there no such bugs on RDNA2, but Ampere had all kinds of issues, from bad drivers to bad POZZEDCAPS to certain games bricking 3090s.

          Also, that’s the average across all titles, you abject moron. Unless you’re playing some very specific title optimized for nVidia, that last gen AMD card beats it for only $400. Imagine paying $800 to get worse than last gen midrange AMD performance lmao.

          Now go sit on your "Genius Bar" cuck stool you scrotebrained Apple tier brandwhore scrote and stop pretending you give two shits about hardware performance because clearly you do not and solely care about buying the logo. You’re like some welfare hoodrat spending $300 on Nikes because you want the swoosh to show muh other siss.

          • #103332
            Anonymous
            Guest

            >buying ayymd garbage
            >in any year
            lol’d

            • #103336
              Anonymous
              Guest

              >Paying 800$
              >For a thing you can get for 400$

              I bet you scrotes shill iphones here all day too.

          • #103339
            Anonymous
            Guest

            >>CUDA
            >Oh so you do professional work? It’s like mentioning tensor cores. You know why nobody brings that up? Because most people aren’t using it for professional machine learning work on Windows, they’re using it for freaking gaming. None of that is relevant if you’re not using it for your job.
            CUDA is literally used in most art products. 3D modeling, raster/vector editors, etc. will make use of CUDA. You’re just coping because you have nothing except gayming as a personality.
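
            If it helps make that concrete, here’s roughly what hooking an Nvidia card into a renderer looks like from the artist side – a minimal sketch using Blender’s Python API (bpy), assuming a recent Cycles build; exact property names can differ between versions:

            # Minimal sketch, Blender's bundled Python (bpy); property paths may vary by version.
            import bpy

            # Point Cycles at the GPU compute backend (OptiX on RTX cards, CUDA otherwise).
            prefs = bpy.context.preferences.addons["cycles"].preferences
            prefs.compute_device_type = "OPTIX"   # or "CUDA" on pre-RTX hardware
            prefs.get_devices()                   # refresh the detected device list

            # Enable every non-CPU device Cycles found.
            for dev in prefs.devices:
                dev.use = (dev.type != "CPU")

            # Render the current scene on the GPU instead of the CPU.
            bpy.context.scene.cycles.device = "GPU"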

            • #103428
              Anonymous
              Guest

              >post says CUDA is for work
              >mentions art products aka work
              >mentions variety of use cases for work not games
              So in other words, you’re saying that post is entirely correct. How is that coping? You’re agreeing with the post and then coping about it.

              >buying ayymd garbage
              >in any year
              lol’d

              >imagine unironically buying shittier hardware for more money because of the logo
              You’re literally at the level of Macscrotes. In other words that post is also correct, you’re an Apple "genius" NPC when choosing hardware.

              So many words to justify buying a device for running modern AAA shitty games for scrotes and scrotes?

              Considering the nature of the OP, why are you even here? Then stick to your RX 480 or GTX 970 or whatever (probably the best decision in this market).

              >No CUDA
              >No OptiX
              >Only 8GB of VRAM
              >Buggy as shit drivers, nonexistent studio drivers
              >Only edges ahead of the 3060 in gooms that aren’t AMD-sponsored
              No.

              Steve does a good job explaining the VRAM situation, although he doesn’t address one thing: if you’re running lots of texture mods at very high resolution, or playing Doom Eternal at 4k, or hell even Deus Ex: Mankind Divided at 4k, you’re gonna need a hell of a lot more than 8gb. DE:MD fills 8gb @ 1440p. Doom Eternal’s recommended reqs are a Radeon VII or RTX 2080ti – you’ll notice each of those cards has at least 11gb of VRAM, which that game absolutely will chew through.

              However this is quite dumb, as the 3060 is absolutely not a 4k card; in fact it barely works at 1440p on some titles. To take advantage of its marketing gimmicks like RT, your res drops to 1080p, where it does worse than a 2070S, and it’s only better value in the sense that the 2070S MSRP was very high, but Ampere MSRP is price gouged too. So the extra VRAM isn’t really useful, and ironically, while the 3060ti has faster VRAM it’s still 8gb, because neither card is designed for 4K, where it’ll run out of VRAM on some titles. Its performance is slotted below the 2070S, 1080ti, and 5700XT.

              • #103434
                Anonymous
                Guest

                I’m on a 6800 XT at 1440p and the most VRAM it’s ever used was 9GB in Horizon Zero Dawn and that game had a blatant memory leaking issue. Not sure where this VRAM fearmongering comes from

                • #103481
                  Anonymous
                  Guest

                  >6800 XT at 1440p
                  Let’s clarify what was written and what you missed. The post you’re replying to (83595895) talked about 4k textures, not 1440p. Your experience is at 1440p, which of course will use around 6-8GB. However, they were discussing 4k and its memory requirements.

                • #103498
                  Anonymous
                  Guest

                  So in other words even at 1440p you’re running out of VRAM. Ok?

                  At 4k you will need much more, which is exactly where cards like the 3080 and 6800XT are slotted in, and which is why the 3080 really isn’t built to last but is designed, as usual, to eventually run out of VRAM in a few years, pushing you to upgrade whether you wanted to or not. The GTX 980 and 970 suffered a similar problem, with the GTX 980ti’s 6gb of VRAM being just enough for the new standard, which was the 6-8gb needed for new games at 1080p. Every nvidia gen, the 80ti is basically designed for the new standard of VRAM, which in this case is 10-12gb at ultra high resolutions. At 4k any triple A shitfest is going to chew through that VRAM. With 8gb you’re bumping textures down or suffering offloading to system RAM, crashes, etc.
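
                  The back-of-envelope math on why 8gb runs out is simple enough – a minimal sketch; the texture size, format and count below are assumptions for illustration, not measurements from any particular game:

                  # Rough VRAM cost of uncompressed 4K textures (illustrative numbers only).
                  BYTES_PER_TEXEL = 4        # RGBA8
                  MIP_OVERHEAD = 4 / 3       # a full mip chain adds about a third

                  def texture_mib(res: int) -> float:
                      """Approximate VRAM for one uncompressed square texture with mips."""
                      return res * res * BYTES_PER_TEXEL * MIP_OVERHEAD / 2**20

                  print(f"one 4096x4096 texture: {texture_mib(4096):.0f} MiB")          # ~85 MiB
                  print(f"100 such textures: {texture_mib(4096) * 100 / 1024:.2f} GiB") # ~8.3 GiB

                  Real engines use block compression, which cuts that by roughly 4-8x, but once you add the framebuffer, geometry and everything else it’s clear why 4k ultra plus texture mods doesn’t leave an 8gb card much headroom.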

              • #103561
                Anonymous
                Guest

                >art is only work
                Do I need to say again that the only personality trait you have is gayming?

                • #103563
                  Anonymous
                  Guest

                  It’s what’s called the general use case of "productivity".
                  All you’re doing is cementing the point that was made previously.

          • #103415
            Anonymous
            Guest

            Some of us have these things called "hobbies", anon. Gaming isn’t a hobby. It’s sad that that concept is lost on you.

      • #103283
        Anonymous
        Guest

        >1080p benchmarks

        • #103365
          Anonymous
          Guest

          Anyone considering this card should watch this first
          https://www.youtube.com/watch?v=3C-RoDtqdJ8

          ayyy lmao he doesn’t actually check hardware reviews before buying memes
          >1080p doesn’t matter
          https://www.youtube.com/watch?v=YY7P_cF9qE8
          >high resolution doesn’t matter
          >thermals don’t matter
          >cost doesn’t matter
          >gaming performance doesn’t matter
          Even at 1440p it loses.
          https://www.youtube.com/watch?v=3C-RoDtqdJ8
          >bbbut at least my EVGA 3060 OC can match and sometimes even beat 5700XT Taichi at stock with massive overclocks at ultra high resolution!
          Pic related.

          3060 is not a 4k card, and at 4k RT on a 3060 doesn’t matter because it’s literally unplayable, and is also shit with RT at 1440p because it’s worse than a 2070 super or 5700XT and thus runs RT like ass above 1080p.

          • #103368
            Anonymous
            Guest

            >That second video
            >Those graphics glitches
            >Those stutters
            >Those drivers
            >Calling that a win for the 5700XT
            Embarrassing.

          • #103432
            Anonymous
            Guest

            >ayyy lmao he doesn’t actually check hardware reviews before buying memes
            It’s your job to present your argument in a non-scrotebrained manner. Next time put your best foot forward.

            • #103472
              Anonymous
              Guest

              You got btfo with incredible ease OP:
              https://www.youtube.com/watch?v=3C-RoDtqdJ8
              Proofs and citations had already been provided. It’s LITERALLY the exact card you decided to go with and post for the OP. You’ve already been given an in-depth review of that exact card, done in a pretty objective and scientific manner. It’s just an overall inferior card with very little generational uplift considering the price; it basically just has better thermals than last gen’s midrange AMD card while performing much worse even at higher resolutions, which is just sad.

              To get mildly better performance than last gen’s midrange AMD card, you get to pay the exact same MSRP of $400 for a 3060ti now as for a 5700XT two years ago – and that’s assuming MSRP. Paying any more than that is getting openly cucked and robbed.

              Paying $800 for a card which performs worse than a card that AMD released two years ago for half that price? Now that’s just fuckin sad man.

              • #103477
                Anonymous
                Guest

                I’m not commenting on OP.
                I don’t give a fuck about the 3060. I have a 3080.
                I’m calling (You) a scrote because you used 1080p benchmarks to try to prove a point for graphics cards.

                • #103528
                  Anonymous
                  Guest

                  >1080p doesn’t matter
                  See

                  You got btfo with incredible ease OP:
                  https://www.youtube.com/watch?v=3C-RoDtqdJ8
                  Proofs and citations had already been provided. It’s LITERALLY the exact card you decided to go with and post for the OP. You’ve already been given an in-depth review of that exact card, done in a pretty objective and scientific manner. It’s just an overall inferior card with very little generational uplift considering the price; it basically just has better thermals than last gen’s midrange AMD card while performing much worse even at higher resolutions, which is just sad.

                  To get mildly better performance than last gen’s midrange AMD card, you get to pay the exact same MSRP of $400 for a 3060ti now as for a 5700XT two years ago – and that’s assuming MSRP. Paying any more than that is getting openly cucked and robbed.

                  Paying $800 for a card which performs worse than a card that AMD released two years ago for half that price? Now that’s just fuckin sad man.

                  The benchmarks cover 4k, 1440p, and 1080p, so I don’t have any clue what your issue with including 1080p is, considering the 3060 non-Ti is basically a high refresh 1080p card. On some modern titles, like say AC Valhalla, it gets sub-50fps performance at 1440p. It should still work fine for most older titles at 1440p, but it’s obviously not really designed for it, and with raytracing enabled it absolutely isn’t capable of being a 1440p card. That cutoff is more like the 3060ti, or even the 1080ti, 2070S, 5700XT or whatever. Below that level of performance it will start to struggle at 1440p; in fact even those cards can struggle at 1440p on certain brand new releases.

              • #103479
                Anonymous
                Guest

                >$800
                >I LITERALLY CANNOT THINK ABOUT ANYTHING BEYOND SCALPER PRICES AND GAMING
                The 3060 is $340. The 3060Ti is $399. 5700 XTs were going for unreasonable prices second hand before Ampere even launched. AC Odyssey is an AMD sponsored title. Hitman 3 uses FidelityFX. Red Dead Redemption 2 is an AMD sponsored title. Strange Brigade is an AMD sponsored title. Cyberpunk 2077 is an AMD sponsored title. Every other chart you’ve schizophrenically stitched together is either within generational margins or doesn’t feature the 5700 XT at all.
                Beyond that, there’s other uses for GPUs that aren’t gaming. Just because a Radeon GPU is good value for you doesn’t make it good value for everyone else.

                • #103490
                  Anonymous
                  Guest

                  not him, but to be clear, those are FE prices
                  if you want triple fan you need about $50 more. It might not be necessary for the 3060 but I might want that for the Ti

                • #103527
                  Anonymous
                  Guest

                  Holy shit lmao
                  >Cyberpunk is an AMD sponsored title
                  >everything is a sponsored title
                  https://www.techspot.com/article/2164-cyberpunk-benchmarks/
                  >sponsorship doesn’t matter
                  >performance doesn’t matter
                  It’s like you’re strawmanning yourself to make nvidia buyers look like complete coping imbeciles. Cyberpunk is nVidia sponsored you complete freaking moron.
                  Cope harder now go sit on the bottle.

      • #103410
        Anonymous
        Guest

        I don’t care about FPS. I am using mine as a deep learning workstation, and that requires CUDA and RTX.
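
        For context, this is roughly what the start of that workflow looks like – a minimal sketch assuming a PyTorch install; nothing here is tied to any particular card:

        # Illustrative sketch: sanity-check the CUDA stack before a training run.
        import torch

        if not torch.cuda.is_available():
            raise RuntimeError("No CUDA device found")

        device = torch.device("cuda:0")
        print(torch.cuda.get_device_name(device))
        print(f"{torch.cuda.get_device_properties(device).total_memory / 2**30:.1f} GiB VRAM")

        # Mixed precision is what leans on the tensor cores the RTX line ships with.
        model = torch.nn.Linear(512, 512).to(device)
        x = torch.randn(64, 512, device=device)
        with torch.autocast(device_type="cuda", dtype=torch.float16):
            y = model(x)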

        • #103494
          Anonymous
          Guest

          Well there you go then. Good for you, you have a professional use case.

          However most people looking at these are not doing it for productivity. Most of them are gaymers and exactly the type of scrotebrained people corpo propaganda works on, who see a flagship 80ti tier card and associate "wow! nvidia so fast! Much fps!" and then spend over a hundred dollars more on similar or outright inferior performance compared to AMD because of the logo. It makes me wonder what kind of scrotebrain is going to do the same with Intel and pay Intel’s price gouged prices because "wow! Intel so fast! Much fps gains! Shiny known corporate logo!" I guarantee you Intel GPUs will be overpriced and run like ass but scrotebrained gaymer children will buy them anyway.

      • #103529
        Anonymous
        Guest

        The 30xx cards are better at higher resolutions.
        That’s all.

        • #103558
          Anonymous
          Guest

          Yes. This is true, and those resolutions are also included above, and even with that in mind the 3060 still loses to the 5700XT at those resolutions for the most part. It’s not quite as bad as the "1060" 3gb vs 1060 6gb, but between the 3060 and 3060ti it’s not that far off. It even manages to fall behind the 2070 Super in raytracing. This makes it overwhelmingly a bad deal, and yet in spite of that it’s going to become the most popular new card on the Steam survey, because most people are idiots.

    • #103276
      Anonymous
      Guest

      Given: I need more.
      Assume I don’t need more. Contradiction. Therefore, I need more.

    • #103280
      Anonymous
      Guest

      Bought a 3080 anyway.

    • #103286
      Anonymous
      Guest

      I need 8K 360Hz ULTRA RAY TRACING XTREME WITH 152 texture packs

    • #103321
      Anonymous
      Guest

      The daily poorscrote cope thread has arrived.

    • #103342
      Anonymous
      Guest

      >need
      SHALL NOT BE INFRINGED

    • #103404
      Anonymous
      Guest

      I love EVGA. But I won an MSI RTX 3060 Ventus 3X and I am pleased by the thermal performance of the three fans. Also, not a single scrote rainbow light for my workstation.

      • #103419
        Anonymous
        Guest

        a freaking 3060 doesn’t need a tri axial cooler. this is just price gouging on their part. fuck every AIB who does this

        • #103421
          Anonymous
          Guest

          >a freaking 3060 doesn’t need a tri axial cooler.
          You say that but it never goes over 50c at full load. That’s a win in my book!

    • #103417
      Anonymous
      Guest

      I have an RTX 2070. That is not an upgrade. I need more.

    • #103439
      Anonymous
      Guest

      Right now i need less, a lot less.

    • #103442
      Anonymous
      Guest

      I’m not buying a GPU until it can render 90 percent of my games at 4k60 for under 130 dollars. I can wait, since the tech and the hardware to get there are going to take forever anyways.

      • #103475
        Anonymous
        Guest

        At that price? ahahahahaha freaking NOPE.

        Sorry m8 but you’re completely bonked if that is your budget. At present prices that will get you, what, an RX 550? A GTX 770 or something? Considering not just inflation but the massive hand rubbing price gouging of these companies, fueled by the absolute scrotebrains buying scalper priced GPUs right now, prices are never coming back down to anywhere near that. 4k60 bottom end cards next gen are going to be $400-500 for an xx50ti SKU. The 7500XT will be like $450. The 7700XT/RTX 4070 will probably do 60fps on all titles and probably be good for 144hz 4k monitors on most non-AAA 2023 titles, though it’ll likely be $600 MSRP at least. Thank all the idiots buying these things right now, like OP, for that.

    • #103483
      Anonymous
      Guest

      Before upgrading I always ask the question: have you finished Black Ops 3 and Modern Warfare Remastered yet? The answer is no, I haven’t; I’ve just been watching YouTube and listening to iTunes. I still played Splitgate at 60 fps max settings on 1080p with no issues, so why upgrade?

      • #103530
        Anonymous
        Guest

        You’re a wise man

    • #103487
      Anonymous
      Guest

      The only thing I don’t need more of is these scrote freaking threads, goddamn.

    • #103501
      Anonymous
      Guest

      The 3060 is a laughably bad card. It barely belongs in the same series as the 3060 Ti and up. It’s not even the full GPU, just to squeeze you that little bit more when you’re already taking a massive hit compared to the 3060 Ti.

    • #103514
      Anonymous
      Guest

      I’d go even lower. You don’t "need" more than an IGP.
      If you think you "need" more, you’re a goddamn gamer who needs to remove himself from the gene pool pronto. Not to mention you can play any worthwhile games on it anyway.

    • #103531
      Anonymous
      Guest

      The bar is still set too high for something you "don’t need more" than. A 3060 would already be a nice-to-have for many users at this point.

    • #103534
      Anonymous
      Guest

      Since I have attained the status of a 40 year old wizard and my vision is deteriorating, I doubt I will ever upgrade to a higher resolution than 1080p. Therefore the 3060 is the perfect card for me. I also feel superior because it has 12gb of 192-bit GDDR6 compared to the 8gb of 128-bit GDDR6 that my RX 5500XT had. What disappoints me is that my RX 5500XT was only $179 after MIR and the RTX 3060 was $500.
      But I am also fully aware that every other damn desirable thing has doubled in price as well.

      • #103537
        Anonymous
        Guest

        I also want to add that I have my RX 5500XT up for sale for $500 and I am confident I will get that for it

    • #103553
      Anonymous
      Guest

      >another "you don’t need more" post
      >the RTX 3060

      • #103556
        Anonymous
        Guest

        you really don’t need more than the R5 3600 and the RTX 3060 and 32gb of DDR4 3600 CL16

        • #103560
          Anonymous
          Guest

          His point was that "you don’t need more" is generally like 16gb of DDR3, a 3570k/3770k/2600k OC’d or something like that with a GTX 970, 980, 1060, RX 580 etc. Which is also true. You generally will be fine on that setup at 1080p. Most people don’t need anything like the newest GPUs. It only starts mattering at very high resolutions with brand new titles. A GTX 980 FE card could easily get you through 2022 if you’re below 1440p. Otherwise just turn a couple settings down or deal with 50fps, it’s not going to kill you.

          • #103562
            Anonymous
            Guest

            yeah I get the meme but the 3600 is the new 2500K (if you buy that theory)

        • #103570
          Anonymous
          Guest

          you don’t even hold 144 fps on Dota of all things with an R5 3600/3600CL16 RAM, let alone more demanding games

    • #103567
      Anonymous
      Guest

      i got a 3090 for free so i don’t care about your poorscrote opinions

      • #103572
        Anonymous
        Guest

        >got it for free
        >calls people poorscrotes
        Even someone who bought a shitty $150 card has shown more wealth than you, neet. I’ll be impressed when you do something with your life.

    • #103575
      Anonymous
      Guest

      get an OG 1080 and chill.

    • #103589
      Anonymous
      Guest

      I’m happy with my 980 ti. Shrugs.

    • #103590
      Anonymous
      Guest

      i dont need more than integrated graphics, anon. im not a gaymer
