As you probably are aware, the iPhone 12 can shoot HDR in HLG BT.2020 Dolby Vision, H.265 10-bit planar.
The problem is that, because of the tiny sensor, it doesn't capture many stops of dynamic range, which in turn means few nits of peak brightness. On top of that, what Apple seems to be doing with the signal is very peculiar and requires a dedicated linear transformation matrix, hence this LUT.
The reason why I had to make yet another matrix is that I wasn't really pleased with what the standard tonemapping algorithms were producing. In the comparison below: on the left, Reinhard; in the middle, Hable; on the right, proper HDR HLG BT.2020 with both the transfer curve and the color matrix interpreted correctly:
https://i.imgur.com/nBNE3q5.png
They came straight from FFmpeg's zscale filter, but similar results can be achieved in AviSynth with HDRTools.
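For reference, the two operators compared above are simple curves applied to linear-light values. Here's a minimal sketch in Python; the Hable constants are the commonly used "Uncharted 2" values and the white point `W = 11.2` is an assumption on my part, not something taken from zscale's internals:

```python
# Sketch of the two tonemapping operators compared above, applied to
# linear-light luminance normalized so that 1.0 = SDR reference white.

def reinhard(x: float) -> float:
    # Simple Reinhard: compresses [0, inf) into [0, 1).
    return x / (1.0 + x)

def hable_partial(x: float) -> float:
    # Hable ("Uncharted 2") filmic curve with the commonly used constants.
    A, B, C, D, E, F = 0.15, 0.50, 0.10, 0.20, 0.02, 0.30
    return (x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F) - E / F

def hable(x: float, white: float = 11.2) -> float:
    # Normalize so the chosen white point maps exactly to 1.0.
    return hable_partial(x) / hable_partial(white)

for v in (0.0, 0.5, 1.0, 4.0, 11.2):
    print(f"{v:5.1f}  reinhard={reinhard(v):.4f}  hable={hable(v):.4f}")
```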
After spending lots of time tweaking the parameters, I decided to make the LUT myself, and this is what I've got.
It seems that Apple's interpretation of the HLG curve is not exactly based on the BBC spec, as it has a black point that starts very high, way higher than it should. HLG is known to be a hybrid curve: the lower half behaves like a conventional SDR-style BT.2020 gamma curve, then it transitions into a logarithmic segment for its HDR range, which is why it can be viewed on both SDR and HDR monitors. In Apple's implementation, however, this is NOT the case: blacks start way higher, as if the whole thing were a purely logarithmic curve. This creates problems, because tonemapping algorithms expect the curve to start lower and get "confused".
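For comparison, the reference HLG OETF from the BBC/ITU-R BT.2100 spec really does reach zero at black; a small sketch with the published BT.2100 constants:

```python
import math

# Reference HLG OETF per ITU-R BT.2100: maps scene-linear light E in [0, 1]
# to the non-linear signal E' in [0, 1]. Note that E' = 0 at E = 0 --
# the spec curve has no raised black point.

A = 0.17883277
B = 1.0 - 4.0 * A                 # 0.28466892
C = 0.5 - A * math.log(4.0 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)           # square-root (SDR-like) segment
    return A * math.log(12.0 * e - B) + C   # logarithmic (HDR) segment

print(hlg_oetf(0.0))         # black maps to 0.0
print(hlg_oetf(1.0 / 12.0))  # knee between the two segments: 0.5
print(hlg_oetf(1.0))         # peak maps to ~1.0
```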
====
so, in this case a custom LUT is the way to go ... ?
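Probably, yes. For what it's worth, a 3D LUT is just a text file in most workflows. Here's a minimal sketch of writing one in the Adobe/Resolve .cube format in Python; the `transform` function is an identity placeholder for whatever matrix/curve correction you end up with, and the file name and LUT size are arbitrary:

```python
# Minimal sketch: write a 3D LUT in the .cube text format.
# The per-channel transform below is the identity -- a placeholder for
# the actual correction you want to bake in.

def transform(r: float, g: float, b: float):
    # Placeholder: replace with your own matrix/curve correction.
    return r, g, b

def write_cube(path: str, size: int = 17) -> None:
    with open(path, "w") as f:
        f.write('TITLE "custom LUT"\n')
        f.write(f"LUT_3D_SIZE {size}\n")
        # .cube ordering: red varies fastest, then green, then blue.
        for bi in range(size):
            for gi in range(size):
                for ri in range(size):
                    r, g, b = (ri / (size - 1), gi / (size - 1), bi / (size - 1))
                    ro, go, bo = transform(r, g, b)
                    f.write(f"{ro:.6f} {go:.6f} {bo:.6f}\n")

write_cube("identity.cube", size=17)
```

A file like this can then be applied in FFmpeg with the `lut3d` filter, e.g. `-vf lut3d=identity.cube`.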