https://github.com/CESNET/UltraGrid/discussions/203
10bit display with SDL/GL
#203
has this hint (it may not be needed with DisplayPort):
====
The referenced PDF says this:
"Figure 5: 10-bit display feature can be enabled by checking the “Enable 10-bit pixel format support” checkbox in the Catalyst
Control Center. Note that this feature is only available in workstation (ATI FireGL™) cards.
Once the 10-bit pixel format support is enabled the system will request a reboot and after that any 10-
bit aware OpenGL application will be displayed in 10-bits without any clamping. In the following, we
demonstrate how an application programmer can easily create a 10-bit aware OpenGL
[skip]
Creating a 10-bit Texture
The previous section highlighted several methods to choose a 10-bit pixel format. It is important to note
that once a 10-bit pixel format is chosen any smooth shading operation will immediately take advantage
of the increased bit depth. In other words, the user does not need to explicitly provide 10-bit input data
to benefit from the increased precision. This is due to the fact the internal processing of colors in the
graphics card is in floating-point precision with 10-bit (or 8-bit) conversion occurring only at the output
stage.
However, it is also possible to explicitly create and work with 10-bit textures as will be explained in this
section. 10-bit texture support is exposed to the user through a packed RGB10_A2 texture format,
which contains 10 bits for each of the color channels and 2 bits for the alpha component.
====
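The part elided above covers how to request a 10-bit pixel format. For the SDL/GL display path, the equivalent request with SDL2 would look roughly like the sketch below (an illustration under that assumption, not UltraGrid's actual code). The 10/10/10/2 channel sizes are only a hint, so the depth actually granted should be read back with SDL_GL_GetAttribute:

```c
/* Minimal sketch (assumption, not UltraGrid code): ask SDL2/OpenGL for a
 * 10-bit-per-channel framebuffer and verify what the driver actually granted. */
#include <SDL2/SDL.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    (void) argc; (void) argv;
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        fprintf(stderr, "SDL_Init: %s\n", SDL_GetError());
        return 1;
    }

    /* Request 10 bits per color channel and 2 bits of alpha (R10G10B10A2). */
    SDL_GL_SetAttribute(SDL_GL_RED_SIZE,   10);
    SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 10);
    SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,  10);
    SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE,  2);

    SDL_Window *win = SDL_CreateWindow("10-bit test",
            SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
            640, 480, SDL_WINDOW_OPENGL);
    if (win == NULL || SDL_GL_CreateContext(win) == NULL) {
        fprintf(stderr, "window/context: %s\n", SDL_GetError());
        return 1;
    }

    /* The request is only a hint - check what was actually handed back. */
    int r = 0, g = 0, b = 0;
    SDL_GL_GetAttribute(SDL_GL_RED_SIZE,   &r);
    SDL_GL_GetAttribute(SDL_GL_GREEN_SIZE, &g);
    SDL_GL_GetAttribute(SDL_GL_BLUE_SIZE,  &b);
    printf("framebuffer depth: R%d G%d B%d\n", r, g, b);

    SDL_Quit();
    return 0;
}
```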
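And a minimal sketch of the texture creation the last quoted paragraph describes, using the packed GL_RGB10_A2 internal format (a hypothetical standalone helper, not UltraGrid's code; on platforms whose GL/gl.h only exposes OpenGL 1.1 these enums would come from GL/glext.h or an extension loader):

```c
/* Sketch (assumption): upload a 10-bit texture via the packed RGB10_A2 format.
 * Each pixel is one 32-bit word packed as GL_UNSIGNED_INT_2_10_10_10_REV:
 * 2 alpha bits at the top, then 10 bits each for B, G and R. */
#include <GL/gl.h>

GLuint create_10bit_texture(int width, int height, const GLuint *pixels)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* Internal format GL_RGB10_A2: 10 bits per color channel, 2-bit alpha. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_INT_2_10_10_10_REV, pixels);
    return tex;
}
```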
The original issues should also contain test images.