DLSS 3

apologize.

  1. 3 weeks ago
    Anonymous

    no

  2. 3 weeks ago
    Anonymous

    >made up frames
    >input lag
    tranny technology

    • 3 weeks ago
      Anonymous

      don't worry, everyone will get tranny tech soon enough

      • 3 weeks ago
        Anonymous

        Those pictures look exactly the same

        • 3 weeks ago
          Anonymous

          Perhaps because interpolating between what are essentially two almost identical still images of the character standing still and looking into the distance is trivial?

        • 3 weeks ago
          Anonymous

          That's the point nagger, same quality for 3x the performance

          • 3 weeks ago
            Anonymous

            rude

  3. 3 weeks ago
    Anonymous

    Boy I sure do love me some buzzword wars.

  4. 3 weeks ago
    Anonymous

    NVIDIA are the new apple

    • 3 weeks ago
      Anonymous

      then I kneel

      • 3 weeks ago
        Anonymous

        AMD has better support for Nvidia cards than Nvidia has for Nvidia cards

        • 3 weeks ago
          Anonymous

          Do amdfags actually believe this? Try using an older AMD card and see if it's supported. Nvidia has insanely good driver coverage, even on Linux, as long as you aren't a FOSS fanatic and use a stable distro without monthly kernel updates.

          There's a reason most non-consumer GPU farms use Nvidia, and it's not just because of CUDA.

    • 3 weeks ago
      Anonymous

      They have been since 2010s. Leather Jacket Daddy wants to be the second coming of Steve Jobs.

  5. 3 weeks ago
    Anonymous

    I sleep

  6. 3 weeks ago
    Anonymous

    >my gpu is not supported
    nah

  7. 3 weeks ago
    Anonymous

    >there's people playing their games with upscaling, interpolation and high input lag

    We truly are in the end times.

    • 3 weeks ago
      Anonymous

      > AI upscaling looks better than native rendering
      > lower input lag than native even with frame gen because GPU is rendering at a lower resolution

      anti-AI fags get the rope

      • 3 weeks ago
        Anonymous

        >false
        >false

        I don't want what you're selling. inb4 cherry picked examples

      • 3 weeks ago
        Anonymous

        > lower input lag
        > DLSS
        How many leather jackets do you own?

        • 3 weeks ago
          Anonymous

          the absolute state of brainlets, if you render at a lower resolution, you get more frames, and therefore lower input lag

          • 3 weeks ago
            Anonymous

            Anon, the fake frames are INTERPOLATED, not extrapolated.

            • 3 weeks ago
              Anonymous

              I was referring to DLSS upscaling, which reduces latency.

              DLSS frame gen (interpolation) obviously increases latency, but if you use it along with DLSS upscaling, you get a comparable or even lower input latency compared to native
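              Rough numbers in case it helps; the frame rates below are made-up assumptions, and real pipelines (render queue, Reflex) shift the totals, but the two opposing effects look roughly like this:

              # Toy latency model: input-to-photon delay ~ one frame time per stage shown.
              # All frame rates are hypothetical, purely for illustration.
              def frame_ms(fps):
                  return 1000.0 / fps

              native_fps   = 40.0   # assumed native 4K frame rate
              upscaled_fps = 60.0   # assumed frame rate with DLSS upscaling (lower internal res)

              print(frame_ms(native_fps))        # ~25 ms: native
              print(frame_ms(upscaled_fps))      # ~17 ms: upscaling alone cuts latency
              print(2 * frame_ms(upscaled_fps))  # ~33 ms: frame gen holds back a rendered frame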

        • 3 weeks ago
          Anonymous

          >I was referring to DLSS upscaling, which reduces latency.
          >DLSS frame gen (interpolation) obviously increases latency, but if you use it along with DLSS upscaling, you get a comparable or even lower input latency compared to native

          just wait for the next feature. AI input generation, guessing what input you would make and feeding it in to reduce input latency

          • 3 weeks ago
            Anonymous

            > AI input generation
            > leads to negative input latency
            > time travel created inside GPU creates a singularity that destroys planet earth
            t-thanks Nvidia

          • 3 weeks ago
            Anonymous

            zoomertoddlers would love this since it's not like they can make any conscious decisions anyway and would rather the "game" play itself for them

      • 3 weeks ago
        Anonymous

        False. I tried DLSS and it's a blurry mess with frames that look phony af. I can tell when an image is upscaled, especially because the text looks horrible. Most games that use DLSS or FSR even upscale the UI, which is absurd. The ones that only use it for texture resolution are a bit better, but they still look worse than native.

        • 3 weeks ago
          Anonymous

          4K DLSS Quality is objectively better than native 4K, this is a scientific fact that was tested many times

          • 3 weeks ago
            Anonymous

            Except no one plays at 4k......

            • 3 weeks ago
              Anonymous

              I only play games on my 55-inch 4K 120Hz micro-LED Samsung TV

              • 3 weeks ago
                Anonymous

                congrats you're the perfect candidate for dlss

          • 3 weeks ago
            Anonymous

            > native 4K
            Do any modern games even do native? I thought they all had mandatory TAA or some other kind of Vaseline filter.

            • 3 weeks ago
              Anonymous

              every game has the option to render natively, not sure what you're talking about

          • 3 weeks ago
            Anonymous

            Your proof is static images. Try actually playing a game, once there is motion native is clearly better. You are a fucking retard.

            • 3 weeks ago
              Anonymous

              >once there is motion native is clearly better
              you're thinking of FSR. DLSS 2+ looks great even in motion

        • 3 weeks ago
          Anonymous

          You clearly never tried it at 4K Quality, or you think FSR looks the same as DLSS when it isn't anywhere close to DLSS output. DLSS is great, FSR is a blurry mess similar to TAA.

          >Your proof is static images. Try actually playing a game, once there is motion native is clearly better. You are a fucking retard.
          Wrong, 4K DLSS Quality looks better than native because of shitty temporal AA techniques. You're the fucking retard.

    • 3 weeks ago
      Anonymous

      oh maw gaedwd ees that a bermintide reference??!?!

  8. 3 weeks ago
    Anonymous

    how does nvidia manage to get their fangirls to worship them when they launch tech that essentially screws them over?

    • 3 weeks ago
      Anonymous

      By repeating the word AI over and over.

  9. 3 weeks ago
    Anonymous

    sorry I'm too poor to understand this, explain?

    • 3 weeks ago
      Anonymous

      Basically, instead of making more powerful GPUs, nvidia has started to upscale frames from low resolutions, create interpolated frames from low framerates, and even use fake (AI-guesstimated) rays in raytracing.
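      A hand-wavy sketch of the frame flow being described (the function names are placeholders, not any real NVIDIA API):

      # Placeholder sketch of the per-frame pipeline described above; nothing here
      # is a real API, the stubs just mark where each step happens.
      def render(scene, res):            # the only "honest" work: render at a low internal res
          return {"res": res, "scene": scene}

      def ai_upscale(frame, target):     # DLSS upscaling: reconstruct a higher-res frame
          return {**frame, "res": target, "upscaled": True}

      def ai_interpolate(prev, nxt):     # DLSS 3 frame generation: invent an in-between frame
          return {"res": nxt["res"], "interpolated": True}

      prev = ai_upscale(render("scene", (2560, 1440)), (3840, 2160))
      curr = ai_upscale(render("scene", (2560, 1440)), (3840, 2160))
      fake = ai_interpolate(prev, curr)
      display_order = [prev, fake, curr]  # every other displayed frame is generated, not rendered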

      • 3 weeks ago
        Anonymous

        don't forget to mention AMD is doing the exact same shit, but at a lower quality

        • 3 weeks ago
          Anonymous

          Which is sad. But they don't want to lose the market of retards.
          Still, I'll give kudos to AMD, even if only for HIP and ROCm.

        • 3 weeks ago
          Anonymous

          indeed, but they're just following nvidia since it's selling
          also AMD's can be used on any card afaik

          • 3 weeks ago
            Anonymous

            >can be used on any afaik
            only because it looks like dogshit

  10. 3 weeks ago
    Anonymous

    being a 4090 owner I have no need to apologize for a poorfag solution I will never use

  11. 3 weeks ago
    Anonymous

    >play star field
    >runs like shit, even worse now that dlss3 is a thing, so they can ship the most unoptimized garbage and you run the cope settings

    • 3 weeks ago
      Anonymous

      Star homosexual doesn't officially support DLSS tho, modders had to step in

    • 3 weeks ago
      Anonymous

      Eh, I got it for free as part of my overpriced GPU.
      Gonna play it in a couple of months when the bugfixes from modders are out.
      Still, I'm not very interested when there's no lizard pussy in this game

    • 3 weeks ago
      Anonymous

      >Runs fine on Radeon
      Nvidia didn’t allocate the resources to make Starfield run well. israelitevidia is an AI company now, gamers will just have to buy AMD and Intel.

  12. 3 weeks ago
    Anonymous

    FSR 3

  13. 3 weeks ago
    Anonymous

    I only have a 3080 so DLSS3 can go fuck itself.

    • 3 weeks ago
      Anonymous

      AMD will give you fsr3 so you're chilling

      • 3 weeks ago
        Anonymous

        fsr is garbage just like any other amd technology

  14. 3 weeks ago
    Anonymous

    >we can't make chips fast enough to render stuff smoothly so we'll cheat and upscale instead
    sad

  15. 3 weeks ago
    Anonymous

    Name one good game that makes use of it

    • 3 weeks ago
      Anonymous

      starfield

      • 3 weeks ago
        Anonymous

        Thanks for the horrifying npcs in 4k nvidia

      • 3 weeks ago
        Anonymous

        >trannyfield
        >good game

  16. 3 weeks ago
    Anonymous

    >30 years of GPU innovation
    >fake frames

    • 3 weeks ago
      Anonymous

      Like it or not, AI is the future. Why the fuck would you keep brute-forcing shading of every single fucking pixel more times per second when AI is becoming advanced enough to just imagine the in-between frames? When it can imagine an upscaled image of superior quality?

      • 3 weeks ago
        Anonymous

        Wrong, it is just Nvidia's way of financing their ML-tier hardware. They are using gaymers as the dumping ground for yield and design rejects.

        • 3 weeks ago
          Anonymous

          >t. AMDrone

          • 3 weeks ago
            Anonymous

            >denying reality
            2070 super has 2x the transistors as a 1080 despite the exact same core count and bus width, they could be doubling performance every generation but instead they just add more useless AI cores

            • 3 weeks ago
              Anonymous

              They're betting on AI/ML being the future, I tend to agree with them. Have you even been paying attention to the progress in ML these past years?

              • 3 weeks ago
                Anonymous

                it's called a video card, they save money on professional cards by reusing dies.

                that's great for ML but guess what? it's gay as fuck that I have to spend $1500 on a 4090 for good Stable Diffusion performance. If they made a die that was only tensor cores it could have half the power consumption and half the size, or be the same size and at least twice as fast. They're literally fleecing both markets at once

          • 3 weeks ago
            Anonymous

            >Nvidiot who's too stupid to see the truth that's right in front of them
            Nvidia knows that graphics in general is a mature market with no growth, only an inevitable decline as demand destruction from iGPUs makes discrete SKUs redundant for more and more customers.
            If your engineers can make ML/datacenter ASICs do graphics on the same silicon and keep up with the competition despite this setup, why not do it, if it dramatically cuts down on R&D costs and time?

  17. 3 weeks ago
    Anonymous

    I'm sorry you're all too retarded to stop giving money to nvidia

  18. 3 weeks ago
    Anonymous

    >input lag
    no, and kys

  19. 3 weeks ago
    Anonymous

    I'm sorry. I only use nvidia bullshit because AMD sucks for ML

  20. 3 weeks ago
    Anonymous

    Looks like shit and I haven't owned an AMD/ATI card since 4890.

    • 3 weeks ago
      Anonymous

      >Looks like shit
      [citation required]

      • 3 weeks ago
        Anonymous

        Source: EVERYONE THAT HAS A PAIR OF FUCKING EYEBALLS

        • 3 weeks ago
          Anonymous

          then point me to the DLSS artifacts, you can use this video for reference:

          • 3 weeks ago
            Anonymous

            >dude find me the artifacts in this artifacted VP9 compressed video
            not the sharpest cookie here huh

            • 3 weeks ago
              Anonymous

              >Posts compressed video as proof
              lulz

              so you're saying your imaginary DLSS 3 artifacts are so small and insignificant they're completely erased with video compression?

              I love DLSS now!

              • 3 weeks ago
                Anonymous

                2560 × 1440 × 8 bits × 3 channels × 120 fps ≈ 9.88 Gbps
                Even with the most efficient lossless video codecs, you'd need a bitrate of ~450-500 Mbps.
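                For anyone who wants to check the arithmetic (assuming 8-bit 4:4:4 capture, which is what the figure above implies):

                # Raw bandwidth of an uncompressed 1440p 120 Hz 8-bit RGB capture.
                width, height = 2560, 1440
                bits, channels, fps = 8, 3, 120

                raw = width * height * bits * channels * fps
                print(raw / 1e9)    # ~10.6 Gbit/s (decimal prefix)
                print(raw / 2**30)  # ~9.89 Gibit/s, the figure quoted above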

              • 3 weeks ago
                Anonymous

                >500 megabits
                ironic when GPUs can process terabytes worth of data per second

              • 3 weeks ago
                Anonymous

                Are you retarded or what? I'm talking about the captured video.

          • 3 weeks ago
            Anonymous

            >Posts compressed video as proof
            lulz

  21. 3 weeks ago
    Anonymous

    I'm sorry for shidding in your shoes and for calling you not a real woman, DLSS3. You are the naggerkike of all naggerisraelite. I'm sorry for not being racist, misogynist and meat eating ENOUGH, I will do better from now on. Praise Odin. Death to Zog. Yeet trannies.

  22. 3 weeks ago
    Anonymous

    I jump back and forth from AMD to NVIDIA depending on the best bang for buck when I pull the trigger, fanboying is wasteful

  23. 3 weeks ago
    Anonymous

    If I had an RTX 40 series GPU I would go try it right now, but I don't, so I can't form an opinion on how it looks or how tolerable the input latency is beyond watching videos of it in use

  24. 3 weeks ago
    Anonymous

    >making an excuse for upscaling
    Yes, you do need to apologize.

  25. 3 weeks ago
    Anonymous

    I kneel.

  26. 3 weeks ago
    Anonymous

    remember when nvidia made graphics chips? those were pretty good.

  27. 3 weeks ago
    Anonymous

    I can't wait for dlss4 to be rtx5000 exclusive and in barely 10 games I don't play

  28. 3 weeks ago
    Anonymous

    DLSS 3.5 with ray tracing actually improves visual quality.
    It gets rid of the denoiser and uses the tensor cores instead.
    The result is higher detail GI not achievable with any other technique.

    The level of AMDdrone cope must be insufferable at this point.

    • 3 weeks ago
      Anonymous

      Why are all the colors so wrong compared to the reference image?

      • 3 weeks ago
        Anonymous

        The tubes with the panels on top are light sources, and the colored highlights are a result of that.
        DLSS off is not a reference image but a biased real-time global illumination engine, so-called """path tracing""" (it's not), with a denoiser running on CUDA cores.
        This engine is not capable of producing a reference image, but the one on the right would be closest if there were one. So if you were to render this in an actual offline path tracer, the DLSS 3.5 image would be more similar to it than anything else.

        srsly learn the basics of light transport methods before posting.
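        If the biased/unbiased distinction is the sticking point, here's a toy version of it (nothing to do with Nvidia's actual engine, just the statistics):

        import random
        random.seed(0)

        # Toy pixel: true brightness is the mean over all light paths, including rare
        # very bright ones. An unbiased estimator averages raw samples and converges to
        # the truth; a biased-but-low-noise estimator clamps bright outliers (as real-time
        # GI engines do to kill fireflies) and converges to a slightly wrong value.
        def sample_path():
            return 10.0 if random.random() < 0.01 else 0.5

        n = 200_000
        samples = [sample_path() for _ in range(n)]

        unbiased = sum(samples) / n                       # -> ~0.595, the ground truth
        biased   = sum(min(s, 2.0) for s in samples) / n  # -> ~0.515, never reaches it

        print(unbiased, biased)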

        • 3 weeks ago
          Anonymous

          > reference image is not a reference image
          Okay.
          > it would have been the same as DLSS 3.5
          How about you present an actual properly rendered and traced image and then start talking about how DLSS compares? Because by now a properly traced one looks nothing like the guessworked upscale by nVidia.

          • 3 weeks ago
            Anonymous

            Learn what reference image means in rendering retard.
            Learn the difference between biased and unbiased rendering.

            • 3 weeks ago
              Anonymous

              > no "reference"
              > but it would totally look like our guesstimate slop
              Sure thing, rabbi.

              • 3 weeks ago
                Anonymous

                You could have googled what unbiased rendering means, but you didn't.
                Oh well, I tried.

              • 3 weeks ago
                Anonymous

                > you can google
                Lol, should I "do my own research" so you don't "waste time"?
                Point is, there is no "reference" according to you, so any bullshit about how it would totally look like Nvidia's slop, because it's just so good and even Jensen said so, is nothing more than shilling.

              • 3 weeks ago
                Anonymous

                I don't think you know what a denoiser is.

      • 3 weeks ago
        Anonymous

        It's not a reference, it's the old raytracing without scaling - the image itself is just supposed to show the evolution of the DLSS feature set, not how the new integrated denoiser improves quality.

        • 3 weeks ago
          Anonymous

          > supposed to show the evolution
          > not a reference
          Why don't they show it in comparison to a properly rendered reference frame then?

          • 3 weeks ago
            Anonymous

            Because the purpose of this slide was feature set and performance, other slides show quality improvements (using the same raytracing model but with the old/new denoiser).

            • 3 weeks ago
              Anonymous

              > the purpose
              To show that it's supposedly better, since it's an advertising material. I'd like to see how it looks compared to the proper rendered reference image, so I can see which one is actually closer to the ground truth.

          • 3 weeks ago
            Anonymous

            There are comparisons on this page, only 2 are what you are looking for (the car headlights and reflections):
            https://www.nvidia.com/en-au/geforce/news/nvidia-dlss-3-5-ray-reconstruction/

            > the purpose
            >To show that it's supposedly better, since it's an advertising material. I'd like to see how it looks compared to the proper rendered reference image, so I can see which one is actually closer to the ground truth.

            >I'd like to see how it looks compared to the proper rendered reference image, so I can see which one is actually closer to the ground truth.
            It's a denoiser, it doesn't change the look, only the quality of the reconstruction.

            • 3 weeks ago
              Anonymous

              > no comparison of the same image with 3.5
              What a coincidence.

              • 3 weeks ago
                Anonymous

                There are 2, with the old denoiser showing significant artifacting.

              • 3 weeks ago
                Anonymous

                Maybe I'm retarded, but can you copy it here?
                I see a pic of headlights with reference and shitty denoiser, but not with DLSS 3.5

              • 3 weeks ago
                Anonymous

                Keep scrolling down (or ctrl-f for "In the following scene from Cyberpunk 2077, the inaccurate headlight illumination"). But yes, you can use that small snippet if you want to compare to an actual reference.

              • 3 weeks ago
                Anonymous

                That one compares to a different DLSS version and not to the reference.
                It's all very misleading.

              • 3 weeks ago
                Anonymous

                It's comparing to CDPR's denoiser.

                If you just don't believe in their denoiser tech, you can read the papers it's based on:
                https://research.nvidia.com/publication/2021-07_rearchitecting-spatiotemporal-resampling-production
                https://research.nvidia.com/publication/2022-07_generalized-resampled-importance-sampling-foundations-restir
                https://research.nvidia.com/publication/2023-03_joint-neural-denoising-surfaces-and-volumes

              • 3 weeks ago
                Anonymous

                > if you don't believe then
                They could simply show the proper reference, old tech and new tech side by side. But they only show reference <-> old and new <-> old (on different scenes, of course, so no direct comparison can be made).
                Smells extremely fishy.

              • 3 weeks ago
                Anonymous

                Because it's an ad for an unreleased game not a paper, and improvements over the old method are obvious.

              • 3 weeks ago
                Anonymous

                > because it's an ad
                No shit. How convenient that there is no clear comparison. But obviously, the newest is much better than real, just trust us, guys. No, we won't show it, stop being antisemitic.

              • 3 weeks ago
                Anonymous

                They don't claim it's better than fully resolving - I have to ask again, do you know what a denoiser is?

              • 3 weeks ago
                Anonymous

                > They don't claim
                Yet retards in this thread do and say that it's way closer to the "true" version without showing any such version in comparisons.

              • 3 weeks ago
                Anonymous

                It is closer to the true version, because the old version has significant artifacting and their denoiser has much less artifacting. You don't need many comparisons when the flaws are so great and well known.

              • 3 weeks ago
                Anonymous

                > it's closer to the truth
                > you don't need many comparison
                I'd be okay with a couple. Hell, even a single one that compares "truth", old and new directly and clearly on the same scene. Preferably the one they like to tout with the pink coloration.

              • 3 weeks ago
                Anonymous

                Also, bonus points if they can prove that the true image was not used as part of the training set.

  29. 3 weeks ago
    Anonymous

    Only if it gives me a solid 75fps on high/ultra and I can't tell the difference between native and DLSS Quality at 1080p.

  30. 3 weeks ago
    Anonymous

    Fuck no

    • 3 weeks ago
      Anonymous

      turn off vegetation, set shadows to mid, and turn off DLSS.

      • 3 weeks ago
        Anonymous

        >turn off DLSS.
        No fucking shit sherlock

    • 3 weeks ago
      Anonymous

      Stop using DLSS frame generation.

  31. 3 weeks ago
    Anonymous

    >game developers will get lazier with optimising their products and rely on gpu tech to make them playable
    >nvidia will continue to kneecap generations of cards by not allowing the new dlss versions on previous cards, ensuring the consumer has to pay for yet another inflated-cost product
    Gamers may be more stupid than cryptards

    • 3 weeks ago
      Anonymous

      >game developers will get lazier with optimising their products and rely on gpu tech to make them playable
      DLSS and FSR are the solutions to that not the cause of it.

      • 3 weeks ago
        Anonymous

        The actual cause is the increased reliance on third-party assets.
        Before, game assets used to be made in-house, which meant studios could re-use textures and shaders.
        Today studios rely on importing assets from a variety of sources that all come with their own textures and custom shaders.
        That means higher VRAM requirements and a less efficient rendering pipeline due to the large number of shaders.

        • 3 weeks ago
          Anonymous

          Most shaders are shared, textures were rarely ever reused.

          • 3 weeks ago
            Anonymous

            Depends on how far you go back. The trend started in the 2010s; before that there definitely was texture re-use.
            Furthermore, shader re-use can happen at a low level with the high-level representations being different. But this still requires a bunch of context switching, which is more expensive on a GPU than on a CPU.

            • 3 weeks ago
              Anonymous

              >Depends on how far you go back. The trend started in the 2010s before that there definitly was texture re-use.
              Textures rarely were, and it was dependent on artists not copying textures outside of the engine (there are duplicate textures in Quake, for example). This system is no different now.
              >Furthermore shaders re-use can happen on a low level with high level representations being different. But this still requires a bunch of context switching which is more expensive on a GPU than on a CPU.
              That's because shader graphs were given to artists, who will create slightly different and incompatible shaders, but this is internal to companies (e.g. most skins in Fortnite have unique shaders), not to external content libraries (such as Megascans) which will share shaders or have shaders simple enough to be converted with a script.

  32. 3 weeks ago
    Anonymous

    >We could do the math and come up with the right answer but instead we will just have an AI take a guess and we'll assume that it's right.

    • 3 weeks ago
      Anonymous

      You are mistaken, and sorely misunderstand why denoisers are necessary, if you think a proper solution is available.
      Here is Intel's very similar solution, which does compare to a reference, though that reference is also noisy:

      • 3 weeks ago
        Anonymous

        why do they always insist on using RNG for rays? half the problems with the 'noisy' image is that the same pixel is constantly changing despite the character and world being still
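        Short answer: a fixed sample pattern is just as wrong every frame, while fresh random rays let temporal accumulation average toward the right answer. A toy version (real renderers use smarter low-discrepancy sequences, this is just the idea):

        import random
        random.seed(1)

        # Toy pixel whose true value is the average over many possible light paths.
        paths = [random.random() for _ in range(1000)]
        truth = sum(paths) / len(paths)

        fixed = paths[0]  # trace the same single path every frame: stable, but stuck wrong

        # Fresh random path each frame + temporal accumulation: noisy per frame, converges.
        frames = 120
        accum = sum(random.choice(paths) for _ in range(frames)) / frames

        print(truth, fixed, accum)  # accum ends up far closer to truth than fixed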

  33. 3 weeks ago
    Anonymous

    >entire thread of poorfags coping
    Honestly I doubt I would ever use it because I have a 4090, but I'm very impressed. If you haven't tried it, your opinion is kind of irrelevant.
    Cyberpunk with path tracing is actually impressive - it looks amazing while running at 120fps. With this newest 3.5 update I don't notice any artifacts anymore, it's pretty much flawless. Yes, CP2077 is shit, but it's a gorgeous tech demo.
    It sucks that the proprietary software really is superior, and if you want to boycott nvidia for its practices and monopoly, go ahead. That doesn't change the fact that frame generation, at least Nvidia's, is really impressive. AI just makes sense to save on resources and get better-looking games.

  34. 3 weeks ago
    Anonymous

    Re DLSS "ray reconstruction" denoiser. DLSS now does raytrace denoising and upscaling in one bigass kernel, rather than separate steps. This is guaranteed to improve quality if implemented correctly, due to less information loss, if you were going to use upscaling anyway.
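    A data-flow sketch of the difference, with placeholder functions rather than real DLSS internals:

    # Old pipeline: a hand-tuned denoiser averages the noisy ray-traced buffers first,
    # then DLSS upscales the already-filtered result - detail lost in step 1 is gone.
    # New pipeline ("ray reconstruction"): one network sees the raw noisy samples plus
    # the guide buffers and does denoise + upscale together.
    def denoise(buffers, guides):             return {"stage": "denoised"}
    def upscale(image, guides):               return {"stage": "upscaled"}
    def denoise_and_upscale(buffers, guides): return {"stage": "denoised+upscaled"}

    noisy_rt = "noisy low-res ray-traced buffers"
    guides   = "motion vectors, depth, normals"

    old = upscale(denoise(noisy_rt, guides), guides)   # two lossy steps in sequence
    new = denoise_and_upscale(noisy_rt, guides)        # one joint reconstruction pass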

  35. 3 weeks ago
    Anonymous

    https://videocardz.com/newz/nintendo-switch-2-allegedly-ran-matrix-awakens-ue5-demo-powered-by-nvidia-dlss-during-private-gamescom-showcase

    NVIDIA DLSS WON

    THANK YOU NVIDIA DLSS
