Slope Limit #69
Conversation
Implementation of Slope Limiting, to emulate the TRC handling of other CMMs on a per-transform basis
a mere formatting correction
Never, I’m afraid, unless many people argue for it. Back in 2016, Marti Maria said he wouldn’t want to merge it because
Unfortunately, I had no time left to pursue this further, so I left it at that. Since you reference the IINA thread: Yes, mpv’s color management is broken (at least last time I had time to look) for 2 reasons:
There are several huge threads about this topic, starting in February 2014 (mpv-player/mpv#534, mpv-player/mpv#2815, mpv-player/mpv#3002, mpv-player/mpv#4248, mdejong/MetalBT709Decoder#1), but what it all comes down to is this: We had found a solution to reproduce QuickTime Player’s color management (which is the de facto video color management standard since video color management was introduced by Apple) exactly in mpv, but unfortunately, it was dismissed later on.
Yep, and that is exactly the core of the problem. From an ICC/colorimetric POV you cannot “interpret” PCS data differently in the input and the output conversions. I.e. if the input to mpv is BT.709 (which it mostly is), the output must be BT.709, too, or else you’ll get colorimetrically incorrect data. If you combine a BT.709 (gamma ≈ 1.96) input with a BT.1886 output (gamma ≈ 2.2), you’ll get a steeper video gamma, i.e. more contrast. People used to TV sets find that pleasing, but colorimetrically, it’s wrong. From a color management POV, converting to and from the PCS must never change the colors; keeping colors the same is the whole point of color management. Artfully combining hardware devices with different gamma curves to achieve a desired effect was something that was used in the ancient world of analog video, where there was no way to digitally edit the video any way you liked, but that has no place in computer color management. It’s a bit like studio monitors and Hi-Fi loudspeakers in the audio world; consumers don’t care about precise sound reproduction but prefer the “effect” sound of Hi-Fi speakers; it’s the same with TV sets. This is what we have argued about for years in the threads linked above. Apple still seems to be the only actor in the industry that gets colorimetrically exact video color management right; all the other apps, including VLC and mpv/IINA, produce imagery with far too much contrast.
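In numbers: with the values quoted above, the end-to-end exponent becomes roughly 2.2 / 1.96 ≈ 1.12 instead of 1.0, which is exactly that contrast boost.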
But the difference that distorts the imagery is the difference between BT.709 on the one hand and BT.1886/sRGB on the other. BT.1886 vs. sRGB isn’t the problem.
I don’t know the C9, but given the fact that it’s a TV set, it’s probably awful from a colorimetric POV.
How so? I don’t understand your point here.
BT.1886 is useless for ICC video color management.
A TV set that performs ICC color management? 🤔 Or do you just mean color handling? In the end, it all boils down to the question of whether you prefer correct or pleasing colors. 😎
Whatever. It’s not 1.96 (BT.709), which it would have to be for correct color reproduction.
“Perfect OLED” is an oxymoron. OLED is a consumer technology.
Of course, I meant keeping colors the same as far as possible. The nuances between 10 bit and 8 bit YCbCr are completely negligible relative to the heavy deviation produced by using a gamma of ≥ 2.2 instead of 1.96.
In the context of a computer, any kind of color transformation means ICC color management, or it won’t be consistent with other apps on that computer. If you use your computer as a mere video player, you are free to do whatever you want.
But Hollywood level means consumer level. Hollywood targets its products at consumers.
I don’t doubt the display is technically capable of producing correct colors. It most probably just doesn’t do it if it’s advertised as a TV display. TV displays reproduce colors incorrectly on purpose. We’ve had this discussion for years. This is not the right place for it; Slope Limiting is not a video specific technology.
The different behavior of your two displays is a good example of what ICC color management is all about: abstraction from hardware. From an ICC POV, there is no “BT.1886” display. There are only displays with corresponding color profiles, which take care that the displays (within their physical limits) reproduce the Lab/XYZ colors from the PCS (profile connection space) correctly. So you can exchange all kinds of displays with vastly different physical behavior with no change at all in color reproduction. The physical gamma of the display does not matter at all, because the corresponding ICC profile compensates for it.

This is vastly different from the video world, which (contrary to digital imaging) already existed long before the computer age. Back then, to achieve a desired image manipulation, people had to “creatively combine” physical devices with different physical characteristics. E.g. using the BT.709 signal of a camera (gamma ≈ 1.96) with a BT.1886 display (gamma ≈ 2.4) produced a desired contrast enhancement, but from an ICC POV it is simply wrong (because color must not change in the processing chain). This seems to be a source of endless confusion for people who are coming from the video world to ICC color management.

The main point of ICC color management is interoperability/exchangeability. Different displays with different physical characteristics must not change the color reproduction. Neither must different software. So if you make a screenshot of a video in a video application with an OS-provided utility, save it as a still image and open it in an image editor, the image editor must show the exact same colors that the video application shows. This is only possible if the video application does not use its own, isolated mechanism to generate the display data, but uses the same, system-wide ICC display profile as every other application on the computer.

Now, the emotional issue in this context is that regular ICC display profiles typically do not provide for dim surround compensation because they assume bright surroundings during usage. For people used to TV sets, this makes the contrast too low, although colorimetrically it is correct. But the only place in ICC color management where a dim surround compensation could take place architecturally would be a special ICC display profile that uses a color appearance model for dim surround adjustment. This way, all applications would be affected by the color appearance model adjustment in the same way and would thereby remain consistent in color reproduction with each other.
Camera 1.96. That’s the source data. Whether a color appearance correction in the display output is required or not depends on the viewing conditions and is not known in advance, so it must not be taken into account during image processing.
I think it should be 'Camera 1.0'. I believe the camera should be treated as linear, going into the first transform. At least, that's what I do for camera profiles I create...
Just to avoid misunderstandings: What do you refer to with camera profiles and first transform? ICC profiles and the transform from the camera data into the PCS (XYZ or Lab), i.e. the OETF? If so, XYZ and Lab are both linear, so you must take the 1.96 gamma of a camera with BT.709 output into account.
I wrote a raw processor, and the cameras I work with deliver raw data largely in its original scene-linear relationship. In order to get this data into the ICC workflow, it needs an associated camera profile with a "neutral" tone curve. So, the first transform in my software is camera -> XYZ -> whatever, where XYZ is the PCS and 'whatever' is the destination space, ProPhoto, sRGB, etc. Now, in my software this can be in a distinct 'colorspace' tool stuck in the processing tool chain, but if there is none, it is done at output to display or an image file. If I were to make the camera profile anything other than linear, then the first part of the transform would do something to the raw data not representative of its true starting state, and the second part of the transform, say, to gamma 2.2 sRGB, would not have its intended effect. This way of thinking may not line up with others, but I get decent renders from it...
Well, I let LittleCMS and cmsDoTransform() do the transform. It could just be flipping a coin, pixel-by-pixel, for all I care. I just didn't get good results until I started assigning linear camera profiles to raw data prior to that initial transform.
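For reference, a minimal version of that initial transform with plain lcms2 calls (the file names and the float RGB layout are assumptions; the camera profile is assumed to carry a linear TRC as described above):

```c
#include "lcms2.h"

/* Initial step of a raw-processing chain: linear camera profile -> PCS -> destination.
 * Illustrative only; "camera_linear.icc" and "srgb.icc" are placeholder file names. */
int main(void)
{
    cmsHPROFILE cam = cmsOpenProfileFromFile("camera_linear.icc", "r");
    cmsHPROFILE dst = cmsOpenProfileFromFile("srgb.icc", "r");

    cmsHTRANSFORM xform = cmsCreateTransform(cam, TYPE_RGB_FLT,
                                             dst, TYPE_RGB_FLT,
                                             INTENT_RELATIVE_COLORIMETRIC, 0);

    float in[3]  = { 0.18f, 0.18f, 0.18f };   /* scene-linear mid gray */
    float out[3];
    cmsDoTransform(xform, in, out, 1);        /* transform 1 pixel */

    cmsDeleteTransform(xform);
    cmsCloseProfile(cam);
    cmsCloseProfile(dst);
    return 0;
}
```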
Well, maybe the camera raw data is linear (as opposed to its .mp4/whatever output in BT.709)? I don’t know.
It wasn’t supposed to do that; I closed by accident and therefore immediately reopened.
Theoretically, yes. But funnily, endless trial & error to reproduce the color behavior of QuickTime Player exactly showed that Apple uses 1.96, not the BT.709 curve. 1.96 (with slope limit) reproduces the behavior of QuickTime Player and Final Cut Pro perfectly. Using BT.709 instead already changes the colors noticeably. So to get consistent colors on macOS, you’d need 1.96.
They are, because Apple uses slope limiting.
Yes. And both approaches work in that regard, it’s just that the results are not identical.
But I do. We spent a huge amount of effort to reverse engineer what Apple does, and it’s 100% certain that Apple does it this way. (Which doesn’t mean you have to accept that, of course. But for color management to be consistent on macOS, you’d have to do it this way.) It’s all documented in mpv-player/mpv#534, but this thread is far too long to be readable.
Then googling was maybe too fast. 😉 Apple “officially” describes why it uses 1.96 in https://developer.apple.com/library/archive/technotes/tn2257/_index.html. Personally, I feel that this is just a handy rationalization. To get ICC color management consistent, they knew they’d have to use the BT.709 curve, but handling complex curves on iPhones of that time (2013) was too computationally expensive, so they preferred to replace the complex curve with the equivalent gamma 1.961. But to make that decision tolerable for video people, they came up with the justification provided in that document, which follows a completely different line of argumentation but “by chance” ends up with gamma 1.961.
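For anyone who wants to check the 1.96 figure numerically, here is a small sketch using only the stock lcms2 API (nothing in it is part of this PR; the constants are the standard BT.709 ones):

```c
#include <stdio.h>
#include "lcms2.h"

int main(void)
{
    /* BT.709 decoding curve as lcms parametric type 4:
     * Y = (aX + b)^Gamma for X >= d, Y = cX for X < d  */
    cmsFloat64Number rec709[5] = {
        1.0 / 0.45,     /* Gamma */
        1.0 / 1.099,    /* a     */
        0.099 / 1.099,  /* b     */
        1.0 / 4.5,      /* c     */
        0.081           /* d     */
    };

    cmsToneCurve* bt709 = cmsBuildParametricToneCurve(NULL, 4, rec709);
    cmsToneCurve* g196  = cmsBuildGamma(NULL, 1.961);

    /* lcms fits a pure power law to the piecewise curve; the result
       should come out close to 1.96.                                */
    printf("estimated gamma of BT.709 curve: %.3f\n",
           cmsEstimateGamma(bt709, 0.01));

    /* Mid-tones of the two curves are close; near black they differ. */
    printf("BT.709(0.5) = %.4f   gamma1.961(0.5) = %.4f\n",
           cmsEvalToneCurveFloat(bt709, 0.5f),
           cmsEvalToneCurveFloat(g196, 0.5f));

    cmsFreeToneCurve(bt709);
    cmsFreeToneCurve(g196);
    return 0;
}
```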
Yes. They are clearly distinguishable visually.
What do you refer to with this? To the default ICC display profile on MacBooks? That might well be model specific.
TrueTone did not exist when we made these tests and I highly doubt that TrueTone is usable in connection with ICC color management, but I have not been able to test this so far, since I do not have a TrueTone capable Mac.
? The whole point of ICC display profiles is to abstract from the hardware, so OLED or not should not make any difference at all.
? BT.709 has no black point; only display profiles have one.
Um, yes, but what’s your point?
It obviously does matter if you want to reproduce QuickTime’s behavior, which is the only way to stay color consistent on macOS (and for any camera with ProRes output, for that matter).
Yes and no. macOS uses pure gamma profiles for video (or at least did so when we performed our tests), but Apple’s ColorSync CMM always adds the linear part to the profile in the form of the slope limiting. That’s why slope limiting is so crucial.
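As a rough illustration of what that added linear part means (my formulation, not ColorSync’s code and not the code in this PR): near black the pure power curve is replaced by a straight line, so that the slope of the inverse curve never exceeds the limit, 16 or 32 matching the proposed flags:

```c
#include <math.h>

/* Slope-limited pure-gamma decoding curve (device value -> linear light).
 * Near black the power law x^gamma is replaced by the line x/limit, which
 * caps the slope of the *inverse* curve at `limit`. Illustrative only.   */
static double slope_limited_gamma(double x, double gamma, double limit)
{
    double power  = pow(x, gamma);  /* pure-gamma segment, e.g. gamma = 1.961 */
    double linear = x / limit;      /* linear segment through black, e.g. limit = 32 */
    return (power > linear) ? power : linear;
}
```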
Misunderstanding. I don’t doubt that calibration can achieve good quality; I’m talking about out-of-the-box TV sets.
I cannot add this PR, mainly for 3 reasons:
Anyway, I think there is a clean way to do what you need without all those problems. Instead of opening profiles with cmsOpenProfileFromFile, create your own open function that:
It is so easy. All is kept in memory, and then there are no problems with patents or the CMM. 100 lines or maybe less. For absolute intent, this was also discussed at ICC level. lcms just does what the spec says. Use the adaptation state to get legacy absolute colorimetric.
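For reference, the adaptation state is set globally (or per context) in lcms2; as far as I know, a value of 0 disables chromatic adaptation and yields the legacy absolute colorimetric behavior:

```c
/* lcms2: 0.0 = no adaptation (legacy "ICC-absolute" behavior),
 * 1.0 = complete adaptation (default). Call before cmsCreateTransform(). */
cmsSetAdaptationState(0.0);
```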
Approximately when did this meeting take place? (Meaning, when can we expect to see the new behavior in Apple’s and Adobe’s products?)
So why not just adapt the profiles in the way I described? You could just implement "openProfileForMovies" and everything will work as you wish. Note this would just return a "fixed" cmsHPROFILE handle, without touching anything of the profile on disk. I really don't want to enter into this kind of discussion about what is best or not for a given system; the CMM is just a tool that blindly does what the profile says. If the profile is wrong for any reason, the root cause and the thing to fix is the profile, not the CMM. Regarding your complaints about patents and the ICC, I could share the same opinion, but this is not the adequate place at all. I am no longer in the ICC and don't hold any software patent 😃
In 2013 or so. I wonder why they didn't... or maybe they did but only on V4 profiles? |
In this case I don’t think we can afford to wait till they do it. It’s not an exotic niche issue. It’s about watching videos with correct color. |
Yep. I suspect they did, but only when both profiles are V4. I have done a quick check on Catalina with AdobeRGB1998 (V2) and I can confirm ColorSync does some tweaking in the lower darks. Tonight I will check Display P3, which is a true V4 profile.
Sure! I understand there is a legit need, and this is the reason I never closed the PR. Again, I think providing a customized open function would solve the issue in a clean way. Some pseudo-code:
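Roughly along these lines (a sketch using only the public lcms2 API; the function name, sample count and the way the slope limit is applied are illustrative, not the PR's code):

```c
#include "lcms2.h"

/* Build a slope-limited copy of a decoding tone curve: near black the
 * curve gets a linear segment x/limit, capping the inverse slope at `limit`. */
static cmsToneCurve* SlopeLimitCurve(const cmsToneCurve* orig, double limit)
{
    cmsFloat32Number samples[4096];
    int i;

    for (i = 0; i < 4096; i++) {
        cmsFloat32Number x   = (cmsFloat32Number) i / 4095.0f;
        cmsFloat32Number y   = cmsEvalToneCurveFloat(orig, x);
        cmsFloat32Number lin = x / (cmsFloat32Number) limit;
        samples[i] = (y > lin) ? y : lin;      /* add the linear part near black */
    }
    return cmsBuildTabulatedToneCurveFloat(NULL, 4096, samples);
}

/* Illustrative "fixed" open function: returns an in-memory profile whose
 * RGB TRC tags are slope-limited. Nothing on disk is touched.            */
static cmsHPROFILE openProfileForMovies(const char* path, double limit)
{
    static const cmsTagSignature trc[3] =
        { cmsSigRedTRCTag, cmsSigGreenTRCTag, cmsSigBlueTRCTag };
    cmsHPROFILE h;
    int i;

    h = cmsOpenProfileFromFile(path, "r");
    if (h == NULL) return NULL;

    for (i = 0; i < 3; i++) {
        const cmsToneCurve* orig = (const cmsToneCurve*) cmsReadTag(h, trc[i]);
        if (orig == NULL) continue;            /* e.g. CLUT-based profile: leave as-is */

        cmsToneCurve* limited = SlopeLimitCurve(orig, limit);
        cmsWriteTag(h, trc[i], limited);       /* replaces the tag in the memory copy */
        cmsFreeToneCurve(limited);             /* cmsWriteTag keeps its own copy */
    }
    return h;
}
```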
Then you just open profiles with this function and you have slope limiting implemented in the correct way, not mixed with CLUT and other curves.
I’m not sure whom you asked, but I would say yes.
Whether you use BT.1886 or something else in the last step depends solely on your display and your viewing conditions.
Not necessarily. It might be anything you set as display profile.
What do you mean by “done on GPU”? The tone response curves stored on the graphics card? These are always stored on the graphics card, and they correspond to the display profile the user selects. If the user selects BT.1886, it’s BT.1886; if the user selects P3, it’s P3 and BT.1886 comes nowhere into play.
I am not?
Implementation of Slope Limiting, to emulate the TRC handling of other CMMs on a per-transform basis, as discussed in the Lcms-user mailing list.
Currently, the following emulation settings are known from tests performed on Mac OS X 10.9:
New flags for cmsCreateTransform() and cmsCreateTransformTHR():

cmsFLAGS_SLOPE_LIMIT_32 cmsFLAGS_BLACKPOINTCOMPENSATION
cmsFLAGS_SLOPE_LIMIT_16
cmsFLAGS_SLOPE_LIMIT_16
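For illustration, this is how a transform with the proposed flags would be created (the cmsFLAGS_SLOPE_LIMIT_* flags exist only with this patch applied; the profile file names are placeholders):

```c
#include "lcms2.h"

int main(void)
{
    cmsHPROFILE in  = cmsOpenProfileFromFile("bt709.icc",   "r");  /* placeholder names */
    cmsHPROFILE out = cmsOpenProfileFromFile("display.icc", "r");

    /* One of the emulation settings listed above: slope limit of 32
       combined with black point compensation.                       */
    cmsHTRANSFORM xform = cmsCreateTransform(
        in,  TYPE_RGB_16,
        out, TYPE_RGB_16,
        INTENT_RELATIVE_COLORIMETRIC,
        cmsFLAGS_SLOPE_LIMIT_32 | cmsFLAGS_BLACKPOINTCOMPENSATION);

    /* ... transform pixels with cmsDoTransform() ... */

    cmsDeleteTransform(xform);
    cmsCloseProfile(in);
    cmsCloseProfile(out);
    return 0;
}
```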