Xpra: Ticket #1309: 10-bit color support in opengl client

Split from #909.

The best explanation of the changes required can be found in https://www.nvidia.com/docs/IO/40049/TB-04701-001_v02_new.pdf, see 30-Bit Visual on Linux.

We'll need to tell the server we want 10-bit colour, maybe advertise a new YUV or RGB upload mode.
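
For illustration, one way the client could advertise such an extra upload mode is through its capabilities during the hello exchange; this is only a rough Python sketch, and the capability key names and the "r210" format name are assumptions, not the final protocol:

    # hypothetical sketch: advertise a packed 10-bit RGB upload mode in the
    # client capabilities sent to the server (key names are illustrative)
    def deep_colour_caps(display_depth):
        rgb_formats = ["RGB", "RGBX", "RGBA"]
        if display_depth >= 30:
            # packed 2+10+10+10 bits per pixel, only useful on deep colour displays
            rgb_formats.append("r210")
        return {
            "encodings.rgb_formats": rgb_formats,
            "display.depth": display_depth,
        }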



Wed, 08 Feb 2017 11:08:01 GMT - Antoine Martin: attachment set

gl_check.py output


Wed, 08 Feb 2017 12:36:07 GMT - Antoine Martin: owner changed

With r15015, running xpra/client/gl/gl_check.py against a 30-bit display I get attachment/ticket/1309/gl_check.txt, which shows:

* blue-size                       : 10
* red-size                        : 10
* green-size                      : 10
* depth                           : 30

So we can detect support for 30-bit color, i.e. 10 bits per channel. Follow-up revisions:

* r15018 handles 30-bit modes with native 30-bit upload: "r210" == "GL_UNSIGNED_INT_2_10_10_10_REV"
* r15019 fixes the swapped red and blue colour channels (oops)
* r15026 allows us to prefer the high bit depth "r210" plain rgb encoding if the client is using 10-bit depth rendering (jpeg and video encodings will still be used for lossy packets)
* r15027 shows the bit depth on session info (normal bit depth is 24)

We could probably handle R210 the same way (as "GL_UNSIGNED_INT_2_10_10_10"), but since I don't have hardware to test with, this is not supported.
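
For reference, a minimal sketch (assuming PyOpenGL and an already-current GL context on a 30-bit visual; not xpra's actual paint code) of what a native 30-bit "r210" texture upload looks like - the GL_RGB10_A2 internal format and the GL_BGRA component order are assumptions:

    from OpenGL.GL import (
        glGenTextures, glBindTexture, glPixelStorei, glTexImage2D,
        GL_TEXTURE_2D, GL_RGB10_A2, GL_BGRA,
        GL_UNSIGNED_INT_2_10_10_10_REV, GL_UNPACK_ALIGNMENT,
    )

    def upload_r210(width, height, pixel_data):
        # each "r210" pixel is a single 32-bit word: 2 bits alpha + 3 x 10 bits colour
        texture = glGenTextures(1)
        glBindTexture(GL_TEXTURE_2D, texture)
        glPixelStorei(GL_UNPACK_ALIGNMENT, 4)
        # the packed type tells the driver how the 10-bit components are laid out,
        # GL_RGB10_A2 keeps the full 10 bits per channel in the texture:
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, width, height, 0,
                     GL_BGRA, GL_UNSIGNED_INT_2_10_10_10_REV, pixel_data)
        return texture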

@afarr: FYI, we can now handle high color depth displays (only tested on Linux).


Thu, 09 Feb 2017 08:30:19 GMT - Antoine Martin: attachment set

shows the bit depth on session info


Thu, 16 Feb 2017 17:16:58 GMT - Antoine Martin:

PS: r15094 fixes opengl rendering, which broke because our hacked pygtkglext library is missing the "get_depth" method; OSX clients will not support high bit depths until this is fixed: #1443


Sun, 19 Feb 2017 06:37:13 GMT - Antoine Martin: milestone changed


Mon, 20 Feb 2017 09:10:00 GMT - Antoine Martin:

See new wiki page: wiki/ImageDepth


Tue, 20 Jun 2017 20:00:35 GMT - J. Max Mena:

Realistically, we won't be able to test this until we get proper hardware for it. And even then, I have no idea what said proper hardware will be.

@antoine - some input as to what we should be testing with would be nice, but I wouldn't hold my breath on us actually getting said equipment if it involves asking for new hardware.


Tue, 20 Jun 2017 20:56:03 GMT - Antoine Martin: owner changed

@antoine - some input as to what we should be testing with would be nice, but I wouldn't hold my breath on us actually getting said equipment if it involves asking for new hardware.

You may already have all you need:

More nvidia info here: 10-bit per color support on NVIDIA Geforce GPUs

Actually verifying that you are rendering at 10-bit per colour is a bit harder:
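
As a starting point, the framebuffer depth of the current context can be queried directly; this sketch assumes PyOpenGL and a legacy/compatibility context (GL_RED_BITS and friends were removed from core profiles), and it only confirms that the framebuffer is 10-bit, not that the application actually rendered 10-bit content:

    from OpenGL.GL import glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS

    def framebuffer_bits():
        # must be called with a GL context current (e.g. from a paint callback);
        # a 30-bit visual should report (10, 10, 10), a standard one (8, 8, 8)
        bits = []
        for attr in (GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS):
            value = glGetIntegerv(attr)
            # PyOpenGL may return a length-1 array here depending on the version
            bits.append(int(value[0]) if hasattr(value, "__len__") else int(value))
        return tuple(bits)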


Edit: AMD’s 10-bit Video Output Technology seems to indicate that 10-bit color requires a "firepro" workstation card


Wed, 12 Jul 2017 07:40:10 GMT - Antoine Martin:

Updates and fixes:

The test application is ready in #1553, but it's not really easy to use because it requires opengl: virtualgl can't handle the "r210" pixel format, and the software gl renderer doesn't support it either. So in order to test, I had to run the xpra server against my main desktop with the nvidia driver configured at 10 bpc. Then connect a client... and the only client I had available for testing was a Windows 7 system, and MS Windows doesn't do 10 bpc with consumer cards, so I had to swap cards. Then the monitor it was connected to didn't handle 10 bpc, so I had to swap that. Then the cables were too short. Then I had to make fixes (see this ticket and many other fixes yesterday - bugs you only hit with --use-display, for example...). TL;DR: hard to test!


Wed, 12 Jul 2017 12:09:18 GMT - Antoine Martin:

r16303: the "pixel-depth" option can now be used to force the opengl client to use deep color (use any value higher than 30), even if the display doesn't claim to render deep color. i.e.: running the server with --pixel-depth=30 -d compress, and a Linux opengl client with --pixel-depth=30 --opengl=yes, I see:

compress:   0.1ms for  499x316  pixels at    0,0    for wid=1     using  rgb24 with ratio   1.6% \
    (  615KB to     9KB), sequence     5, client_options={'lz4': 1, 'rgb_format': 'r210'}

Note the "r210" rgb format. Same result if the client is running on a 30-bit display with --pixel-depth=0 (the default). Whereas if the client runs on a 24-bit display, or if we force-disable deep color with --pixel-depth=24, then we see:

compress:   1.4ms for  499x316  pixels at    0,0    for wid=1     using  rgb24 with ratio   1.3% \
    (  615KB to     7KB), sequence     3, client_options={'lz4': 1, 'rgb_format': 'RGB'}
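
The selection described above boils down to something like this sketch (a hypothetical helper, not xpra's actual code): the "pixel-depth" option overrides the detected display depth, and deep colour selects the packed "r210" format:

    def pick_rgb_format(display_depth, pixel_depth_option=0):
        # pixel-depth=0 (the default) means "use whatever the display provides",
        # any explicit value overrides the detected display depth
        depth = pixel_depth_option or display_depth
        if depth >= 30:
            return "r210"   # 2+10+10+10 bits packed into 32-bit words
        return "RGB"        # plain 8 bits per channel

    assert pick_rgb_format(30) == "r210"       # 30-bit display, default option
    assert pick_rgb_format(24, 36) == "r210"   # deep colour forced on a 24-bit display
    assert pick_rgb_format(30, 24) == "RGB"    # deep colour force-disabled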

Remaining issues:


Thu, 13 Jul 2017 15:26:54 GMT - Antoine Martin:

Updates:

With these changes, it is now much easier to:

For macos, see also #1443


Thu, 20 Jul 2017 12:37:01 GMT - Antoine Martin: status changed; resolution set

Tested on win32 (no luck) and Linux (OK) as part of #1553; for macos testing see #1443. Closing.


Fri, 28 Jul 2017 10:09:47 GMT - Antoine Martin:

opengl applications running through virtualgl currently require this patch: ticket:1577#comment:2


Mon, 29 Jul 2019 15:17:11 GMT - Antoine Martin:

NVIDIA @ SIGGRAPH 2019: NV to Enable 30-bit OpenGL Support on GeForce/Titan Cards: At long last, NVIDIA is dropping the requirement to use a Quadro card to get 30-bit (10bpc) color support on OpenGL applications; the company will finally be extending that feature to GeForce and Titan cards as well.


Sat, 23 Jan 2021 05:20:44 GMT - migration script:

this ticket has been moved to: https://github.com/Xpra-org/xpra/issues/1309