When will Linux get support for HDR? Posted on July 2, 2022 by Anonymous
SDR looks better in that image. the HDR part seems saturated
That's all HDR is.
t. doesn't know what hdr is
hdr is literally saturation and glow. it existed even in fallout 3, you fucking retard. in fact, you are forced to play fallout 3 with hdr on, or else nuka cola quantums won't do their signature blue glow and will look like normal nuka colas. oh and btw, you can play fallout 3 in wine and turn hdr on, so it works on linux, you absolute brainlet
Different HDR. OP is talking about color with more than 256 levels of brightness, which requires software+GPU+monitor support. LULZ does not support any HDR image format, so there is no HDR in this thread and the comparison is fake.
In-game HDR hardly has anything to do with HDR in displays.
Fallout may try to emulate an HDR image by making lights especially vibrant, but the HDR being talked about in this thread involves displays and the OS support necessary to enable HDR, which changes the fundamentals of digital video.
HDR uses the perceptual quantizer (PQ) rather than the "gamma curve" as the electro-optical transfer function (EOTF).
HDR uses 10 or 12 bits per channel and the Rec. 2020 color space. Regular video assumes 8 bits per channel and the sRGB/Rec. 709 color space.
HDR also requires the host+GPU to pass metadata to the display, while also reading the display's brightness capabilities to tone map properly.
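If anyone wants the actual math behind the PQ point: here's a rough Python sketch of the ST 2084 PQ EOTF next to a plain 2.2 gamma curve. Constants are straight from the spec; this is an illustration, not a reference implementation.

```python
# SMPTE ST 2084 (PQ) constants, as defined in the spec.
m1 = 2610 / 16384        # ~0.1593
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """PQ-encoded signal in [0,1] -> absolute luminance in nits (0..10000)."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def gamma_eotf(signal: float, peak_nits: float = 100.0) -> float:
    """Simple 2.2 'gamma curve' SDR approximation with a nominal 100-nit peak."""
    return peak_nits * signal ** 2.2

print(pq_eotf(1.0))     # full-scale PQ decodes to 10000 nits
print(gamma_eotf(1.0))  # full-scale SDR gamma lands at ~100 nits
```

That gap — a signal scale that addresses absolute luminance up to 10000 nits instead of a relative curve topping out around 100 — is the "range" in high dynamic range.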
dunning-kruger retard
HDR video is a specific set of new standards relating to the way color and brightness is encoded, such as perceptual quantization, wide color gamuts and greater bit depths.
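To put the "wide color gamut" part in numbers: the xy chromaticity primaries are published in the specs, so you can compute that the Rec. 2020 triangle covers roughly twice the area of sRGB's. A quick Python sketch using the shoelace formula — a back-of-the-envelope area comparison, nothing more:

```python
# CIE 1931 xy chromaticity primaries (R, G, B) from each spec.
SRGB    = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # IEC 61966-2-1
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # ITU-R BT.2020

def gamut_area(primaries):
    """Area of the gamut triangle in xy space (shoelace formula)."""
    (xr, yr), (xg, yg), (xb, yb) = primaries
    return abs(xr * (yg - yb) + xg * (yb - yr) + xb * (yr - yg)) / 2

ratio = gamut_area(REC2020) / gamut_area(SRGB)
print(f"Rec. 2020 gamut is {ratio:.2f}x the area of sRGB")  # ~1.9x
```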
It's an 8bit sRGB PNG. The whole thing is SDR.
>the sdr pic looks much better
>Thinks he can see HDR on an sdr screen through an sdr image
Fucking drooling troglodyte anon
I was making an observation on the image presented, but whatever, keep breathing that copium, sweaty.
You literally don't have the hardware to see HDR, stupid. It's like saying you know what 1080p looks like from a 480p screen.
It is possible to make a reasonable SDR vs HDR comparison by capturing an SDR display and an HDR display using the same camera settings and tonemapping the comparison properly to SDR. You won't know what HDR actually looks like, but you'll know the difference between them. I don't think OP is quite that clever, but in theory that could be what the image in OP does.
No, it's as fake as those 1ms vs 5ms monitor ads. Until any fucking anon has seen (irl, actually used for days) something better than that 100+in $100k LG OLED they released, don't even claim shit, because they're reading this on their 1080p Walmart VA trash. I have seen the peak and it's honestly fucking glorious.
The camera data from an SDR and HDR monitor side-by-side is not "fake". No, you don't get to see what it actually looks like IRL, and you may or may not understand the subtleties of the tonemapping used, but as a pure comparison, it is not "fake", nor is it meaningless (if done right).
Yeah you're right, I didn't read through properly, out of reflex from all the other stupid responses. Unfortunately that's not what the ad is; it's just some jumbo boosted sat/contrast. You're right though: if it's through a camera it can be reasonably done, but even then the limited DR of the best cams today (~13.5 stops) makes it impossible to show properly. Reasonable maybe, but it's a completely different experience seeing actual HDR. Light seems to have volume as it works through the scene, etc., stuff you can't get just by reading a spec sheet.
But yea mb, it's just the op pic is literally the 1ms vs 5ms bullshit.
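For anyone wondering what "tonemapping the comparison properly to SDR" would mean in practice: here's a minimal Python sketch using a Reinhard-style operator. This is the crudest possible curve, chosen purely for illustration — real comparisons and graders use far fancier operators.

```python
# Minimal Reinhard-style tone map: compress linear HDR luminance (in nits)
# into [0,1] so it can be encoded for an SDR display. The curve passes
# through 0.5 at SDR reference white and only asymptotically approaches
# 1.0, so a 10000-nit highlight is squeezed in instead of clipping.
def reinhard(nits: float, sdr_white: float = 100.0) -> float:
    l = nits / sdr_white   # normalize so SDR reference white ~ 1.0
    return l / (1.0 + l)   # classic Reinhard global operator

for nits in (1.0, 100.0, 1000.0, 10000.0):
    print(f"{nits:>7} nits -> {reinhard(nits):.3f}")
```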
when a good consumer HDR implementation exists
i do not know what iso ft hdr or exposure is. i will use the phone camera. i will use imagemagick and gimp. i will never learn photography jargon. i will destroy all camera devices
my macbook supports hdr at peak 1600nits but i only use it to watch anime at sdr 720p
Isn't HDR about RANGE? Everything is bright on the right; it doesn't show dark and black together well. That's actually portrayed better on the left.
Do you also compare which loudspeakers are best by listening to recordings of them on your phone?
Why would a server need HDR?
>year of the linux desktop 2022
>why would a server need HDR
Use ayymd and enjoy your over-saturated videos.
Wayland may eventually get HDR support; let's hope it's good like macOS's and not like the Windows implementation. Once it does I'll get an HDR-capable monitor and switch to Wayland. Hopefully in the next few years.
wayland lacks even basic color management, which is a shame. I know most people don't have a colorimeter, but in my mind wayland can only be a laptop display protocol as it stands, since laptops typically don't have hardware controls for RGB anyway.
right looks like shit. everything mushes into whiteness because the whole thing is saturated. it's like you boosted saturation to 100 on your shitty 1366x768 tv
because it's not HDR
it's simply a retarded way to get nobrain consoomers to buy shit, like one of those shitty chinese ads on taobao or banggood or something
Yeah, proper HDR requires monitors that are capable of displaying incredibly bright pixels, AND artists and content creators that don't blow their load with the extra luminance range on >muh vibrancy
Here's an SDR image that vaguely captures the range of brightnesses real life environments produce.
When Wayland trannies finally implement it. Coming never ever in Xorg though sadly.
Linux can do HDR since last year if you have a driver with VK_EXT_hdr_metadata support (AMDVLK).
It's sadly still a buggy mess anon, not stable on all systems.
>SDR content viewed in SDR > SDR content viewed in HDR > HDR content viewed in HDR > HDR content viewed in SDR
HDR is cancer and complete and utter regression.
Delusional, morons like you would've said color tv was backwards when b/w was the standard option.
You're an idiot. A standard like HDR was inevitable given technological advancement and the limitations of SDR.
>turn on HDR
>everything just becomes bright and washed out
>blacks turn into light grey
>adjust HDR monitor settings but everything still looks like shit
How do I fix this? I've looked online but nothing works.
It's just a limitation of modern technology and implementations of the standard, unfortunately. I'd recommend watching your HDR media tonemapped to 10 bit SDR in the meantime.
Anon, I don't think you quite understand how broken it looks on my machine
if that's windows, it's because windows has a dogshit sdr-to-hdr conversion, so everything that isn't hdr will look awful
play something that is actually hdr and it will look good if you've got a real hdr monitor/tv
Wait so I have to dig through the Windows settings every single time I want to watch something that's HDR?
What's the fucking point?
The point is HDR sucks in both Linux and windows.
Best thing is to get an Nvidia shield TV pro and leave all your HDR content for that dedicated device.
>hdr sucks in linux
>get a device that runs google/linux
He must've meant gnu/linux
Linux as a kernel supports HDR, but only locked down Android boxes have a high level of HDR support baked in. Not only do these devices natively support DRM laden formats and various open and proprietary HDR formats and proprietary surround audio formats, they also have the best pre-configured out of the box color handling, tone mapping, and just ease of use due to being purpose built media playback devices.
>but only locked down Android boxes have a high level of HDR support baked in
What about lineageos?
HDR in GNU/Linux is non-existent, so doesn't suck.
on win 11 there's a shortcut to turn it on
Keep HDR disabled unless you're actually using it to watch HDR content.
>black is 0,0,0
>white is 255,255,255
there it is, I solved all problems
based linux preventing me from being flashbanged by images
You VILL be blinded by the nits goy
ITT: YouDon'tNeedThat™ cope
HDR won't be good until we stop making LCDs.
I'm a lurker, but I'm forced to comment here cuz there's only so much stupidity a human being can handle:
1. The image in the OP is obviously not HDR; you can't see an HDR image on an SDR system/display.
2. About the "over-saturated image when using HDR": if you're stupid enough to think you can get an HDR image from a non-HDR 8-bit source, of course you're going to see an over-saturated image, because the dev who made that nonsense is as stupid as you. Never play SDR media using an "HDR effect" mode on a TV or in software; it will only fuck up the image. HDR has nothing to do with gamma or color saturation. In fact, a well-calibrated OLED HDR/Dolby Vision LG display (like the one I own) playing Dolby Vision or HDR10[+] media can actually look less saturated, because people usually have too much saturation in their TV settings. Dolby Vision, besides encoding at 10 or 12 bits instead of 8, carries metadata that automatically calibrates your TV to the movie's master and your display (yes, that's why DV TVs are so expensive: they have to know the panel's specs to match them to the post-production monitor). And that's what HDR/DV is all about: more RANGE or DEPTH of color, not brighter colors. It's there so that dark scenes with little tonal difference don't turn into huge blocky pixels, which the eye perceives as better color. The brighter colors are usually because HDR panels on TVs also happen to be brighter (more nits) and have more color features, and HDR media takes advantage of that, but that's not what it's about. Without HDR you can play supposedly-HDR videos (which aren't) on youtube and see brighter colors than a TV's native HDR-capable player would give you.
Also, to get true HDR color it's not enough for the movie to be encoded at 10 or 12 bits; the cameras used to film it have to have sensors physically capable of capturing that range and actually encoding it at 10 or 12 bits of depth. You can't imagine how many Hollywood bullshit movies are released on HDR blu-rays that were filmed on SDR or even analog cameras (well, technically it's possible to get an HDR image from analog celluloid if it's digitized with a 10- or 12-bit sensor in the post-production process, but I haven't seen one yet).
The rage made me forget maybe the one thing everyone seems to be missing, and what OP was actually asking: RedHat/Fedora devs have promised to ship full HDR support for Linux in 2022/23. But that's it, they "promised".
Even more stupid than stupidity itself is uneducated stupidity. Here's an intelligent article and youtube video about HDR and why we don't have it yet: https://www.phoronix.com/news/AMD-2022-Linux-HDR-Display-Hard . And windows HDR support and the supposed HDR panels on desktop monitors are a joke/lie, so believe me, windows users don't have HDR support either. It's because of that commercial/marketing lie that people here associate HDR with an over-saturated, brighter image. And don't get me started on gaming HDR... kids these days are stupid as fuck, wasting their parents' hard-earned money on lies.
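The banding point above — 8 bits running out of code values in dark scenes — is easy to demonstrate with a few lines of Python. This is just a toy quantizer over a dark gradient, not how any real encoder works:

```python
# Quantize a smooth dark gradient (linear signal 0.0 .. 0.1) to 8-bit and
# 10-bit code values and count how many distinct steps survive. Fewer steps
# across the same gradient means coarser jumps, i.e. visible banding.
def distinct_steps(bits: int, lo: float = 0.0, hi: float = 0.1,
                   samples: int = 10000) -> int:
    levels = (1 << bits) - 1
    codes = {round((lo + (hi - lo) * i / samples) * levels)
             for i in range(samples + 1)}
    return len(codes)

print("8-bit codes in the dark ramp: ", distinct_steps(8))   # a few dozen
print("10-bit codes in the dark ramp:", distinct_steps(10))  # ~4x as many
```

Roughly four times as many code values over the same dark stretch is why 10-bit video holds up in shadow gradients where 8-bit falls apart into blocks.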