SudoVDA - Peak Brightness Configuration #164
It doesn't read it from the client. The information is not available to the host from Moonlight and Artemis right now, so the value is hardcoded. But peak brightness shouldn't affect the streamed quality. I just answered about HDR on reddit but I'll post here again:
The reason I'm bringing this up is that I was trying to use the HDR Calibration Tool for Windows on the Virtual Display, and they all topped out at around 300-350 nits, no matter the client. The calibration got rid of blown-out highlights and fixed the saturation, but my guess was that the display values topped out at 300 because of the hardcoded EDID. Maybe a coincidence, though?
Just tried an EDID with 1600 nits peak brightness and it gave no different result than the 300 nit one on my client device. Manual calibration ends up with the exact same values, and the performance is still bad on those bad devices. The displayed "nits" in Windows settings is not relevant to your client device and you can ignore that. But people might want the value to look great, so I'll consider replacing it with a higher peak value in the next release.
You can try it out yourself if you don't believe me: this is not a release build, so it's still recommended to use the packaged one with the installer.
Also don't forget to turn on HDR in your client settings, it's right in the first section below resolution and above audio settings.
Thank you for providing the build and the detailed explanation! I appreciate the opportunity to test this.
I can confirm that in the 308 nits build, the calibration pattern disappears at the 308 nits setting in the calibration tool, and at 1650 nits in the 1650 nits build.
I agree that the hardcoded 308 nits feels a bit low. I think setting a value in the 800–1500 nits range would be more typical for HDR displays and better aligned with common HDR mastering practices. An interesting side effect of the 1650 nits build is that it makes hitting the 100% mark in the Windows HDR Calibration Tool easier, especially on smaller screens.

My Assumptions About HDR Streaming

Here's my current understanding of how HDR streaming works in this setup. Please feel free to correct or expand:
After calibrating the virtual display using the Windows HDR Calibration Tool, the results for HDR streaming on my devices have been very good. The calibration process was straightforward, and the HDR output looks excellent on my devices, including my S23FE. 🤷

Open Questions and Observations

I've been wondering how features like AutoHDR or NVIDIA RTX-HDR interact with EDID-provided capabilities:
If any of my AutoHDR/RTX-HDR assumptions are true, then it might actually be safer/better to set the hardcoded value too high, rather than too low. Let me know if there's anything I can help with or if my assumptions are off!
I just wanted to chime in with my own experience. I've been an HDR dork since 2018. My understanding is that HDR content is supposed to be "mastered" for 1000 nits brightness as a sort of de facto standard, and anything else is up to individual movie studios or game devs. This might be where a lot of the nonsense comes from, because there is no formalized engineering standard like there is with color gamut (BT.709, BT.2020, etc.).

My experience trying to get HDR streaming to work properly on Apollo with the "automatic" SudoVDA adapter matches Bernhard's. Since there's no "fixed" adapter, I can't use CRU to tweak the values manually, as Apollo creates a brand new adapter every time, which shows up as a new discrete device in CRU, separate from all the previous virtual displays. If the EDID doesn't advertise at least 1,000 nits, I can't get certain games which have had proper or near-enough-proper HDR implementations to render HDR brightness correctly. One example is Cyberpunk 2077. At 308 nits, my tweaks in the game's HDR calibrator have no noticeable effect on the preview images it cycles through when I crank beyond 300. When I screw around with CRU on Sunshine (which, as you know, uses a fixed virtual adapter) and force a different nits reading, the HDR calibrator in Cyberpunk 2077 begins working as expected, and I can notice changes easily between 100-1600 (where my TV peaks). Basically, it's a similar issue to what Bernhard has with the Windows HDR calibrator, but inside a game instead.

COS, I think your experience might somehow be related to your specific device configurations, how they handle HDR metadata, or other environmental factors. I don't think both the OP and I having the same problem is coincidental. Not sure why you aren't having a problem, but I think it's definitely worth looking into more. I absolutely do understand your perspective on HDR not being worth using/not recommended currently, but for those of us with long-time experience with it who have grown accustomed to having our eyeballs blown out whenever an in-game/in-movie sunrise scene or similar occurs and feel weird without it, it's a "nice-to-have."

My recommendation would be to hardcode either 1,000 or 1,600 nits. I would prefer 1,600 as I am biased given my TV's capability (with power-saving features turned off it's actually much higher, around 2,400, but 1,600 is where my eyes stop caring about the differences and is probably the high end of how games or movies are ever mastered, since a lot of cheaper displays peak at much lower values), but 1,000 is fine for the vast majority of HDR content. I would leave that choice up to you though.
That's strange actually, I always get around 2200 out of the calibration tool on multiple devices no matter what the peak brightness is, and the image is always super dim when HDR is enabled, except for iOS devices and my Xbox connected to my TV. Can you make sure you have checked the "Enable HDR (Experimental)" box on your client? I think I get similar peak brightness issues only when the HDR option on the client isn't enabled. The peak brightness metadata shouldn't be encoded with the stream, as described in the first reply, and your questions are exactly the kind of added complexity that is why I don't recommend HDR.
I agree that it's strange. I'm not an HDR expert by any means, but I can only describe what I am experiencing versus what I experience in-game when a higher peak brightness capability is being advertised by the adapter. The more I read about it on various user forums, both game streaming and calibration forums, posts from CRU users etc., the more confused I get. I was about to throw my hands up before I saw Bernhard's post and your replies tonight and how he had a better experience at 1600. And yes, HDR is enabled on my client (standard Moonlight).

I should note that when I installed your 1600 nits drivers from earlier in the thread, I did have to re-calibrate both the desktop (Windows HDR Calibrator) and Cyberpunk's settings because a bunch of stuff suddenly looked washed out. So there's something happening for sure. But I can't tell you where it's happening on my system or if it's worth your time to really investigate that deeply beyond changing the hardcoded EDID to one with a higher peak rating. As I insinuated earlier, I'm pretty much in agreement with you on how confusingly-implemented HDR is, I just know that I like having it enabled in most games that support it (can't remember the last game where it looked subjectively worse to me than SDR, though I'm sure there have been some), and for whatever reason my system and specific setup doesn't like the lower brightness values.

edit: Don't know if it matters, but I figured I should let you know I'm on Windows 11 24H2, RTX 4080 OC with the latest drivers.

edit 2: Oh, and the HDR on/off selector doesn't stay on between sessions for the added adapter in the Windows display settings - but only after I switched to your 1600 nits drivers. It was fine before. Maybe I installed them wrong? I just pointed the driver update tool in Device Manager at them (after copying them to the driver folder in the Apollo program files folder). I originally thought the wash-out effect was because of that, but it wasn't.

edit 3: BAH, there's something fishy going on. Killed and restarted the session and now the Windows calibrator doesn't show any difference in the test pattern when I move the slider back and forth. Pretty sure I'm doing something wrong at this point! Gonna uninstall SudoVDA, reinstall Apollo and go from there.
Ok, yeah, well... Apparently whatever I did to "reset" my operating environment (cleaning drivers, uninstalling, rebooting, reinstalling, etc.) got my system matched up with yours, COS, because now I'm at the default 308 nits advertised and it's behaving as you believe it should (i.e. it doesn't matter). I was able to calibrate normally on Windows and in-game and it behaved as expected. I'm throwing my hands up, officially; not gonna bother trying to figure out what was different with Sunshine or when I was using CRU or whatever the hell else.

That said, I do think OP has a point about maybe setting the default EDID to something that advertises at least 1,000 nits, as this will put anyone else with similar issues at ease and won't lead them down the wrong diagnostic path if they are having strange HDR configuration issues. Especially since, as you say, it truly should not matter. And maybe updating the Readme/install notes to clarify for new users that the advertised nits are not gonna be their issue if they are having HDR calibration problems, maybe with a link to this thread.
@prefix1647 you've made a really good point here. I just checked and you seem to be right: Windows loses the virtual display's calibration profile every time I start a new session (from the same client) and I have to redo the HDR calibration (not when disconnecting, but after ending the session).
I always get the exact hardcoded peak brightness value out of the calibration tool (300 for the 308 build, 1650 for the 1650 build). HDR is definitely enabled and I get a superb image out of it on my Samsung clients (S23FE and Galaxy Tab S6). There's a noticeable difference between HDR off and HDR on, not only in games but also in HDR-enabled YouTube videos (they look pretty much the same in HDR when playing in the native YouTube app on the phone/tablet and through the HDR-enabled and calibrated stream, = gorgeous). I also just cycled through a few games that let you adjust the nits target (UE5 games) and there's a noticeable difference when cycling through the nits target, as @prefix1647 mentioned. @ClassicOldSong what are your client devices apart from the iPhone and Xbox where HDR looks crap for you? If you're streaming to a TV/monitor - maybe there's some dynamic tonemap setting screwing with the output?
This is a bug that has been plaguing me for a long time. Other users report that Windows messes up the profiles with other clients (not only losing them) and I don't know how that could happen. The devices I have that advertise HDR but look bad are the OnePlus Pad Pro / Lenovo Y700 2025 / OnePlus 13, while the Y700 looks OK-ish but still dimmer than SDR and color accuracy falls off. All these devices look really stunning when streaming SDR. They do play HDR videos quite well with YouTube. I really don't have spare money to buy Samsung devices to test, as at half the price I can get devices with superior specs like the OnePlus ones...
Just tried, the OnePlus 13 actually doesn't look that bad, but it's still dimmer than SDR. And streaming HDR actually added a lot more color artifacts to the picture, which are also noticeable on my other two devices. Edit: the artifacts are also there on iOS devices, just less noticeable. Playing the same video directly on the device doesn't have the problem.
I might also add: my host is Windows 11, 24H2 and the latest NVIDIA drivers (RTX 4090) with the NVIDIA app - maybe that makes a difference. I have also activated "HDR Streaming" in Windows' HDR settings. A good YouTube video test for me was this one, from 2:00-2:20 - the difference between SDR and HDR is pretty stark for this scene, especially when it climbs the tree and during the lightning strikes. Without Windows HDR calibration, the highlights of this scene are blown out (it's a mess of blue), whilst after the calibration, you can see all kinds of shades of blue. https://youtu.be/DbMrGNOyioU?si=5hqKqMNfVldSHIoz With HDR on and calibrated, it looks exactly the same on the HDR stream as it does in the native Android YT app.
I'm using an RTX 4080 Super with 24H2 and the latest NVIDIA driver. I think we have similar host configurations, so that makes things a little bit easier to compare.
I just looked through the upstream VDD repo and there seems to have been some movement regarding HW cursor support, HDR/HDR10+/Wide Color Gamut support, and custom EDID handling since your fork. I'm sure you already checked, but maybe there are some lessons that could be learned from what happened upstream lately.
HW cursor is already supported in SudoVDA. HDR10+ is not meaningful for streaming as there's no encoder that can handle it, same for Wide Color Gamut. Custom EDID would add complexity to the "just works" experience. Those additions are not meaningful for streaming, and I don't think there are any use cases where those options would benefit real-world usage, as it is a virtual display that still somehow needs to be displayed on a physical display. Just checked the video frame by frame, there is a difference in color tolerance but the highlights don't exceed SDR. Maybe my devices handle SDR too well in this scenario and the color accuracy in SDR mode actually beats that in HDR. (Above: iPhone 13 Pro Max playing through the YouTube app natively)
Also, VDDs don't do anything to the picture itself; the driver just reports a bunch of configurations for a "monitor" to the host system. So no matter which driver is in use, the color issue has nothing to do with the driver.
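If anyone wants to double-check what Windows has actually picked up from the virtual display's EDID, a small DXGI query is enough. This is just a diagnostic sketch using the public IDXGIOutput6::GetDesc1 API, not part of SudoVDA or Apollo:

```cpp
// Diagnostic sketch (not part of SudoVDA/Apollo): print the luminance
// values Windows reports for every attached output, virtual or physical.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
        ComPtr<IDXGIOutput> output;
        for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
            ComPtr<IDXGIOutput6> output6;
            DXGI_OUTPUT_DESC1 desc{};
            if (SUCCEEDED(output.As(&output6)) && SUCCEEDED(output6->GetDesc1(&desc))) {
                // These luminance values reflect the display capabilities the
                // OS has on record for the output (i.e. what the EDID advertises).
                wprintf(L"%s  max %.0f nits, full-frame %.0f nits, min %.4f nits\n",
                        desc.DeviceName, desc.MaxLuminance,
                        desc.MaxFullFrameLuminance, desc.MinLuminance);
            }
            output.Reset();
        }
        adapter.Reset();
    }
    return 0;
}
```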
I agree on all of it. I mainly mentioned it in case there's something in there that could give a clue about the forgotten-device issue, or in case it makes sense to rebase your changes on the latest upstream because of other features/fixes that you implemented but could be taken from upstream (thereby maybe reducing your maintenance overhead).
Actually I have no idea how to fix the issue. MTT's implementation doesn't add/remove virtual displays this often, and maybe Microsoft didn't expect people to use the API like this, so it's up to Microsoft to fix the problem... Parsec's virtual display driver doesn't have HDR at all, and it uses the same EDID with the same serial number for each virtual display it creates, which means display configurations can never be managed correctly by Windows. SudoVDA by far solves most of the problems, but still gets backstabbed by this Windows issue.
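For anyone curious what "same serial number" means in practice: the identity Windows uses to remember a monitor's settings largely comes from the EDID, so every virtual display needs its own serial baked into the block. Here's a minimal sketch of the general idea (this is not SudoVDA's actual code), using the standard EDID 1.3/1.4 base-block layout:

```cpp
// Sketch only (not SudoVDA's actual code): stamping a unique serial number
// into a 128-byte EDID base block so each virtual display looks distinct.
// Standard layout: bytes 0x0C..0x0F hold the 32-bit serial (LSB first),
// byte 0x7F is the checksum that makes the block sum to 0 mod 256.
#include <array>
#include <cstdint>

using EdidBlock = std::array<uint8_t, 128>;

void set_serial_and_fix_checksum(EdidBlock& edid, uint32_t serial) {
    edid[0x0C] = static_cast<uint8_t>(serial & 0xFF);
    edid[0x0D] = static_cast<uint8_t>((serial >> 8) & 0xFF);
    edid[0x0E] = static_cast<uint8_t>((serial >> 16) & 0xFF);
    edid[0x0F] = static_cast<uint8_t>((serial >> 24) & 0xFF);

    // Recompute the checksum over the first 127 bytes.
    uint8_t sum = 0;
    for (size_t i = 0; i < 127; ++i) sum += edid[i];
    edid[0x7F] = static_cast<uint8_t>(256 - sum);
}
```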
Rebasing doesn't make sense either: although the repo started as a fork, I rewrote the driver completely halfway through because there were some issues with their original implementation. Their latest version also implemented some "dynamic" configurations, but their implementation still doesn't fit the need here and has other problems, either from a programming perspective or a user's perspective.
Unfortunately, it looks like the virtual adapter that Apollo creates has randomly gone back to having a weird color/brightness issue. On the client device (Samsung 65" QLED TV), bright colors in-game look blown out (small text often unreadable), the in-game HDR calibrator doesn't have any noticeable effect, and the Windows 11 HDR Calibrator doesn't work properly either (the minimum luminance slider only makes large chunked changes with no gradient, and only shows pure black at 0.000, which I'm not sure is correct for my TV but might be, and the max luminance calibrator is an all-white square no matter what I set). HDR is enabled for the display adapter in the Windows display settings, and all other settings look normal. Default driver from your last "official" release (308 nits advertised).

I didn't change anything between the last Apollo session a few days ago and now, so it's something with the new virtual adapter it created this time, or with the way Windows reacts to it. Definitely some weird interplay between it and Windows. Like you mentioned before, not sure it's anything wrong with SudoVDA, Apollo, or Sunshine. Probably all on Microsoft with the way their display adapter management API works. Just wanted to report back on it (and to confirm I wasn't hallucinating in my original report).
This is what I'm getting from most Android devices all the time when HDR is enabled, and the same thing happens on macOS clients. The biggest problem with HDR-related issues/discussions (even arguments) is how inconsistent it is. When it's good, it's really good, but those who have experienced the good might not have experienced the bad. And due to how Apollo/Sunshine works, it's likely that with Apollo it's their first time experiencing HDR streaming, and when they immediately get an unexpected result (due to HDR itself), they start to think it's Apollo's problem that has poor support for HDR... It's complicated to explain why HDR streaming has all these issues, but let's go back to the first reply I made in this issue: an HDR video stream should only be handled/tone mapped by your client device. Calibration shouldn't be needed (at least on the host) and Windows doesn't need to care about the "peak brightness" of the virtual display. But the fact is, calibration in Windows does have an effect on the captured video itself, which means the color is tone mapped TWICE. Paired with Windows's buggy implementation of color profile management, all problems are amplified even more...
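Purely for intuition on why double tone mapping hurts (neither Windows nor Moonlight/Artemis uses this exact curve), here is a tiny sketch: even a simple Reinhard-style operator applied on both the host and the client crushes highlights far more than applying it once.

```cpp
// Illustration only: a simple Reinhard-style curve applied once vs. twice.
// This is NOT what Windows or the clients actually do; it just shows how
// tone mapping the same signal on both ends compounds the compression.
#include <cstdio>

// Compress scene luminance (nits) toward a display peak.
double tonemap(double nits, double peak) {
    const double x = nits / peak;
    return peak * (x / (1.0 + x));
}

int main() {
    const double peak = 1000.0;  // assumed target peak, nits
    const double samples[] = {100.0, 500.0, 1000.0, 4000.0};
    for (double nits : samples) {
        const double once  = tonemap(nits, peak);
        const double twice = tonemap(once, peak);  // host maps, then client maps again
        std::printf("%6.0f nits -> once: %6.1f   twice: %6.1f\n", nits, once, twice);
    }
    return 0;
}
```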
I 100% agree this is all on Windows' implementation and HDR itself being a mess. Windows HDR calibration and internal tonemapping does not affect games as far as I can tell, or at least not games that are set to fullscreen mode - but I do the calibration because I sometimes browse websites from my TV using Apollo/Sunshine (I can control the interface with my iPhone - I stream to Moonlight on an Apple TV 4K, the iPhone TV remote app has a very large touch area for mouse pointer manipulation, and the iPhone's keyboard can be used for text input as well), and the HDR calibration makes a difference there.

When I'm playing a game, the only device that should be tonemapping anything is my TV. Although it's a bit older, my TV was very expensive when I bought it (the equivalent 2024 model is $1,500 cheaper lol) and its display/calibration options reflect that. Cheap TVs don't let you calibrate anything - I recently looked at a $400 "HDR 4K" TV that didn't even let you change RGB balance, much less anything else. My TV has very detailed color and HDR calibration options which I have carefully gone over using rtings and other sources of information to ensure the TV is as neutral as possible and isn't changing its tonemapping based on a change in input/source/media type. I also did the 20-point tuning a while back, using a calibration tool I borrowed from an A/V nerd friend (20-point tunings have to be done on a per-TV basis since the precise values can change based on manufacturing inconsistencies between screen panels). Overall tonemapping is basically always set to "Standard" mode (unfortunately the TV hides the numerical details), and I calibrated the rest based on that. Any and all options that would dynamically change the TV's color calibration or HDR mode based on "content detection" or "source change," or otherwise alter the source image, have been disabled. This is fine since I only ever stream HDR HEVC (BT.2020 color space) from my PC. So, I'm quite sure my TV isn't changing randomly.

As you also suspect, I'm leaning towards Windows randomly being a pisser... as usual... and doing something to the image that the TV's calibrated tonemapping causes to become ugly. But I can appreciate that HDR as a technology is an absolute mess, so I'm not asking you to look into this or try to fix anything. I understand it's not your problem to fix. I mostly wanted to document my experience in detail for other people who may come here trying to find out why their colors look wrong or blown out, so they can understand that it's not something that can be fixed per se, and they just need to tinker until they are satisfied, or stick with SDR.
Peak brightness adjusted to 1671 nits: https://github.com/ClassicOldSong/Apollo/releases/tag/v0.2.7
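The oddly specific numbers in this thread (308, 1671) line up with how the CTA-861.3 HDR static metadata block in an EDID encodes peak luminance, assuming that's where Windows reads the value from: an 8-bit code value CV decodes to 50 × 2^(CV/32) nits, so only certain steps are representable. A quick sketch:

```cpp
// Sketch of the CTA-861.3 "Desired Content Max Luminance" decoding used by
// HDR static metadata blocks in EDIDs (assumption: SudoVDA's EDID uses this
// standard block). An 8-bit code value CV maps to 50 * 2^(CV/32) nits, which
// would explain the 308 (CV = 84) and 1671 (CV = 162) figures seen here.
#include <cmath>
#include <cstdint>
#include <cstdio>

double cta_max_luminance_nits(uint8_t cv) {
    return 50.0 * std::pow(2.0, cv / 32.0);
}

int main() {
    std::printf("CV  84 -> %.1f nits\n", cta_max_luminance_nits(84));   // ~308.4
    std::printf("CV 162 -> %.1f nits\n", cta_max_luminance_nits(162));  // ~1670.8
    return 0;
}
```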
Is there a way to change SudoVDA's peak brightness configuration for the adapter in Windows?
I seem to remember it reading the values correctly from the client in the past, but I'm not sure anymore. I just tried 2 different HDR-capable clients (Galaxy Tab S6 and S23FE) and both show as 308 nits peak brightness in Windows.
So... is there a way to override the config/EDID manually, or make it read the client values?