[Cin] Adobe Premiere (2021) and HDR
Georgy Salnikov
sge at nmr.nioch.nsc.ru
Thu Apr 17 14:52:15 CEST 2025
On Thu, 17 Apr 2025, Andrea paz wrote:
> unnecessary intervention. In an HDR image, the white value above 1.0
> (which we see as homogeneous white = 1.0, however) leads to detailed
> gray values that are no longer homogeneous, thus reconstructing the
> content present in the white.
Andrea, please read carefully the published standards which AndrewR
already mentioned recently; I repeat the references here once again:
https://pub.smpte.org/pub/st2094-10/st2094-10-2021.pdf
https://pub.smpte.org/pub/st2094-40/st2094-40-2020.pdf
Pay attention: it is clearly stated that RGB values SHALL be in the range
[0.0, 1.0] (with a precision of 0.00001). Anything else is illegal.
What white values above 1.0 do you mean after all that?
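For illustration, a minimal C sketch of what the standard permits (my own
illustration; the function name st2094_component is invented, it is not code
from the standard or from CinGG):

    #include <stdio.h>
    #include <math.h>

    /* Clamp an RGB component to the [0.0, 1.0] range required by
     * SMPTE ST 2094 and round it to the stated 0.00001 precision. */
    static double st2094_component(double v)
    {
        if (v < 0.0) v = 0.0;   /* negative values are illegal */
        if (v > 1.0) v = 1.0;   /* values above 1.0 are illegal */
        return round(v * 100000.0) / 100000.0;
    }

    int main(void)
    {
        printf("%.5f\n", st2094_component(1.37));     /* prints 1.00000 */
        printf("%.5f\n", st2094_component(0.123456)); /* prints 0.12346 */
        return 0;
    }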
A hint: a floating-point value representation, in contrast to an integer one,
is defined by two properties: a range (for example, -1000000 to +1000000, or
even -10^38 to +10^38) and a precision (for example, the 7 decimal digits
typical of 32-bit float, or the 17 digits typical of 64-bit double precision).
There also exist 'fixed-point' representations, where, for example, we store a
value as an 8-bit unsigned byte: 0.0 is byte 0, 1.0 is byte 255. It still has
a range of 0.0-1.0 and a precision of 8 bits, i.e. 2 to 3 decimal digits. If
we store the same value in 10-bit form, then 0.0 will be 0, 1.0 will be 1023,
and the precision will be slightly higher than 3 decimal digits.
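In code such fixed-point presentations look like this (a small C sketch with
my own invented names to_fixed/from_fixed, just to make the arithmetic
explicit):

    #include <stdio.h>
    #include <math.h>

    /* Convert a normalized value to N-bit fixed point and back. */
    static unsigned to_fixed(double v, int bits)
    {
        unsigned maxcode = (1u << bits) - 1;  /* 255 for 8 bit, 1023 for 10 bit */
        return (unsigned)lround(v * maxcode);
    }

    static double from_fixed(unsigned code, int bits)
    {
        unsigned maxcode = (1u << bits) - 1;
        return (double)code / maxcode;
    }

    int main(void)
    {
        /* 8 bit: step 1/255 ~ 0.0039, i.e. 2-3 decimal digits;
         * 10 bit: step 1/1023 ~ 0.00098, slightly more than 3 digits.
         * Note the RANGE is [0.0, 1.0] in both cases; only the
         * PRECISION differs. */
        printf("8-bit step  = %.6f\n", 1.0 / 255.0);
        printf("10-bit step = %.6f\n", 1.0 / 1023.0);
        printf("0.5 as 10-bit code     = %u\n", to_fixed(0.5, 10));
        printf("code 512 back to float = %.6f\n", from_fixed(512, 10));
        return 0;
    }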
What do you mean, then, by 'your' high dynamic range? A range width? A
precision? Or simply a combination of the capital letters H, D, and R without
any definite meaning?
> What to do in CinGG if we are dealing with HDR media? If we have an
> HDR monitor I don't know, in fact if anyone has one, that would be
> useful information. If we have an SDR monitor all we can do is tone
> mapping and bring everything back to SDR. The trouble is that in CinGG
If 'HDR' is to be related to digital precision, then I'd say any true SVGA
CRT monitor is an HDR monitor, because the SVGA interface carries an analog
(not digital) signal which can potentially have any precision (in practice
determined by the bit depth of the DAC of the video card that generates the
analog SVGA signal). The same goes for other kinds of analog video interfaces.
But not so for modern LCD monitors attached to SVGA: they first convert the
analog SVGA signal to an internal digital form, and after that everything
depends on their internal math.
> How HDR values relate to color spaces, I just cannot understand.
I tend to think that any image or video whose data have a precision higher
than 8 bits gets called an HDR image.
I sense one more source of misunderstanding in this discussion: taking 'true
HDR monitors' to mean 'truly calibrated monitors'. First of all, any monitor
can be calibrated, independently of its bit precision and even of an analog
interface. Second, any monitor with a digital interface actually IS
CALIBRATED! Any digital monitor must technically contain DACs converting bits
into some millivolts, milliamperes, etc., which finally drive the
light-emitting diodes or whatever else is inside such monitors. If such a
monitor 'needs no calibration', that means nothing other than that the
necessary calibration is hardwired into its firmware. You can rely on the
assumption that it 'needs no calibration', or you can still attach a
calibrator and test the quality of its firmware LUT; if the monitor's
firmware calibration data are really good, you will get almost 1:1 color
curves.
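To make the 'almost 1:1 color curves' check concrete, a small C sketch (again
my own illustration, not code from any real calibrator software) that
compares a measured per-channel curve with the identity curve:

    #include <stdio.h>
    #include <math.h>

    #define LUT_SIZE 256

    /* Compare a measured per-channel transfer curve against the ideal
     * 1:1 (identity) curve and report the worst deviation. With a good
     * firmware LUT the deviation should be close to zero. */
    static double max_deviation(const double measured[LUT_SIZE])
    {
        double worst = 0.0;
        for (int i = 0; i < LUT_SIZE; i++) {
            double ideal = (double)i / (LUT_SIZE - 1);  /* identity curve */
            double d = fabs(measured[i] - ideal);
            if (d > worst) worst = d;
        }
        return worst;
    }

    int main(void)
    {
        double measured[LUT_SIZE];
        for (int i = 0; i < LUT_SIZE; i++)
            measured[i] = (double)i / (LUT_SIZE - 1);  /* pretend a perfect monitor */
        printf("worst deviation from 1:1 = %.6f\n", max_deviation(measured));
        return 0;
    }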
_______________________________________________________________________________
Georgy Salnikov
NMR Group
Novosibirsk Institute of Organic Chemistry
Lavrentjeva, 9, 630090 Novosibirsk, Russia
Phone +7-383-3307864
Email sge at nmr.nioch.nsc.ru
_______________________________________________________________________________