Discussion:
nVidia GeForce & deep colour support in Linux
Milan Knížek
2014-08-27 16:43:17 UTC
Permalink
Hello,

sorry for an off-topic question: does anybody have a working setup in
Linux like:

x nVidia GeForce & DisplayPort & native 10-bit depth LCD

x nVidia GeForce & DisplayPort & native 8-bit + FRC LCD panel [1]

[1] http://www.tftcentral.co.uk/reviews/dell_u2713h.htm
or http://www.prad.de/new/monitore/test/2013/test-asus-pa279q.html


Based on an earlier announcement [2], the nvidia binary driver for Linux
has supported 30-bit depth (RGB) in X for some time already. I succeeded
in turning it on, but got some glitches / colour artefacts as described
on the oyranos blog [3].

[2] http://www.nvidia.com/object/linux-display-ia32-295.20-driver.html
[3] http://www.oyranos.org/2014/05/image-editing-with-30-bit-monitors/


However, a 1024-step ramp was still displayed on my U2713H with 8-bit
gradation only (in Krita with the OpenGL backend enabled, which is one of
the few apps that supposedly support deep colour output).
Also, dispcal with a ColorMunki Photo reports only 8-bit precision of the
video-card LUT.

I have access neither to a Quadro card nor to a native 10-bit LCD to do more tests.

So, has anybody made a GeForce output deep colour in Linux, or is it a
futile effort? (I have been googling and asking for a few weeks now
without success.)

Regards,
Milan
--
http://milan-knizek.net/
About linux and photography (Czech only)
O linuxu a fotografování
Graeme Gill
2014-08-27 23:40:55 UTC
Permalink
Milan Knížek wrote:

Hi,
Post by Milan Knížek
Also, dispcal with a ColorMunki Photo reports only 8-bit precision of the
video-card LUT.
Dispcal may or may not cope gracefully with higher than 8 bpc
framebuffer depth - some of the OS APIs were missing
an allowance for that when I coded it (i.e. MSWin, and the
latest OS X has some strange behavior in this area).
On X11 the depth of the Default Visual is used, since dispwin
creates a window that inherits this from the root window.

The display depth testing is done using the VideoLUT, on the assumption
that the VideoLUT depth is equal to (or, often, greater than) the
framebuffer depth (it would be very odd if it were not), and the aim is to
test the calibration precision possible, the video path to the display and
the display capability.
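
As a rough aside (not Argyll code): on X11 the "VideoLUT" is the per-CRTC
gamma ramp, which can be queried through XRandR. The untested sketch below
only reports the number of ramp entries, which is not the same thing as the
effective precision dispcal measures through the instrument, but it shows
where the table lives:

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void) {
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    /* Walk the CRTCs and print the size of each gamma ramp ("VideoLUT").
     * 1024 entries hints at a 10-bit table, 256 at 8-bit; it says nothing
     * about the precision of the DAC / output stage. */
    XRRScreenResources *res = XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
    for (int i = 0; i < res->ncrtc; i++)
        printf("CRTC %d: gamma ramp has %d entries\n",
               i, XRRGetCrtcGammaSize(dpy, res->crtcs[i]));

    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}

(build with something like: cc lutsize.c -lX11 -lXrandr; the file name is arbitrary)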

Note that a lot of cheaper LCD displays use a 6-bit panel with dithering
to give 8 bits, while higher quality displays use an 8-bit panel with dithering
to give 10 bits effective, and that you need to use VGA, an (uncommon) dual DVI
connection or DisplayPort to hope to get more than 8 bits to the display itself.

Graeme Gill.
Chris Lilley
2014-08-29 12:52:54 UTC
Permalink
Hello Milan,
Post by Milan Knížek
does anybody have a working setup in
x nVidia GeForce & DisplayPort & native 10-bit depth LCD
x nVidia GeForce & DisplayPort & native 8-bit + FRC LCD panel [1]
This isn't the reply you were hoping for; I too have a set of high bit
depth hardware that is totally non-working under Linux. So I'm
interested to hear positive responses.

My system is a Dell Precision M6700 laptop with an Nvidia Quadro K3000M GPU
(native 10 bits per component), with an internal DisplayPort cable (not
LVDS) going to a wide-gamut, native 10-bit-per-component (12-bit
internal) PremierColor IPS screen.

http://www.dell.com/ed/business/p/precision-m6700/pd
http://www.notebookcheck.net/NVIDIA-Quadro-K3000M.76896.0.html
http://www.nvidia.com/content/PDF/product-comparison/Product-Comparison-Quadro-mobile-series.pdf



It's the first PC machine I have had in maybe 15 years that isn't
dual-boot Windows and Linux.

Mostly the Linux Live CDs fail to work; sometimes they work, I do the
install, and then Linux boots to a black screen. I suspect the inability to
provide a 24-bit visual is the reason, together with the 'nouveau'
driver.

I gave up a while ago, but if I have another crack at it I might try
installing with an external HDMI monitor (thus restricted to 8 bpc),
then installing the native NVIDIA driver, and then rebooting with no
external monitor.
Post by Milan Knížek
[1] http://www.tftcentral.co.uk/reviews/dell_u2713h.htm
or http://www.prad.de/new/monitore/test/2013/test-asus-pa279q.html
Based on an earlier announcement [2], the nvidia binary driver for Linux
has supported 30-bit depth (RGB) in X for some time already. I succeeded
in turning it on, but got some glitches / colour artefacts as described
on the oyranos blog [3].
[2] http://www.nvidia.com/object/linux-display-ia32-295.20-driver.html
[3] http://www.oyranos.org/2014/05/image-editing-with-30-bit-monitors/
So you got further than I did, but the dismal state of support for
greater than 8 bpc under Linux is unfortunately not surprising.
--
Best regards,
Chris mailto:chris-***@public.gmane.org
Marwan Daar
2014-08-30 19:52:23 UTC
Permalink
Not sure if this is helpful, relevant, or even meaningful, but my
understanding is that regular GeForce cards only support 10 bit in a
DirectX environment, whereas you need a Quadro for 10-bit OpenGL
support (although it's disturbing that Chris is having issues despite
his Quadro card).

Marwan
János, Tóth F.
2014-08-30 20:29:55 UTC
Permalink
As far as I know, FirePro and Quadro drivers offer nonstandard
proprietary ways to display 10-bit images inside a windowed application
over a "legacy" 8-bit desktop manager (so the user isn't forced into
DirectX or OpenGL fullscreen-exclusive rendering modes and the
developers don't need to create their own specialized user interface
for that unique display mode).

The Radeon cards from the HDMI / DisplayPort and DirectX 11.x era are
proven to work in 10-bit DirectX fullscreen display modes.
I am not sure about the GeForce cards, but it should be a matter of
driver support (I bet the hardware is capable of it, since every
DX11 card should be).
Post by Marwan Daar
Not sure if this is helpful, relevant, or even meaningful, but my
understanding is that regular GeForce cards only support 10 bit in a DirectX
environment, whereas you need a Quadro for 10-bit OpenGL support
(although it's disturbing that Chris is having issues despite his Quadro
card).
Marwan
Kai-Uwe Behrmann
2014-08-31 20:00:53 UTC
Permalink
Post by János, Tóth F.
As far as I know, FirePro and Quadro drivers offer nonstandard
proprietary ways to display 10-bit images inside a windowed application
over a "legacy" 8-bit desktop manager (so the user isn't forced into
DirectX or OpenGL fullscreen-exclusive rendering modes and the
developers don't need to create their own specialized user interface
for that unique display mode).
That sounds like a workaround. The most difficult part is that it gives
the user no choice of how to drive the monitor (24 or 30 bit). With my
Quadro setup + nvidia that is certainly not the case: without a 30-bit
visual there is no 10-bit output. However, I have no idea what AMD does
in their drivers.
Post by János, Tóth F.
The Radeon cards from the HDMI / DisplayPort and DirectX 11.x era are
proven to work in 10-bit DirectX fullscreen display modes.
I am not sure about the GeForce cards, but it should be a matter of
driver support (I bet the hardware is capable of it, since every
DX11 card should be).
Post by Marwan Daar
Not sure if this is helpful, relevant, or even meaningful, but my
understanding is that regular GeForce cards only support 10 bit in a
DirectX environment, whereas you need a Quadro for 10-bit OpenGL support
OpenGL works fine with 30-bit visuals under X.org, be it Krita, Compiz, KWin or ICC Examin.

xwininfo -root | grep Depth

shows me Depth: 30. That means any application which is capable of encoding 10 bits per plane can display 30-bit colour. It is merely a question of finding a toolkit that supports it. Using Xlib directly should enable applications for high bit depth, which is mostly abstracted away by Linux graphics APIs these days. The easier way is using OpenGL.
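
For instance, a bare Xlib program can ask for a depth-30 TrueColor visual
directly. A minimal, untested sketch (not taken from any of the
applications named above):

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>

int main(void) {
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }
    int scr = DefaultScreen(dpy);

    /* Ask for a 30-bit TrueColor visual (10 bits per colour component). */
    XVisualInfo vinfo;
    if (!XMatchVisualInfo(dpy, scr, 30, TrueColor, &vinfo)) {
        fprintf(stderr, "no depth-30 TrueColor visual available\n");
        return 1;
    }

    /* A non-default visual needs its own colormap. */
    XSetWindowAttributes a;
    a.colormap = XCreateColormap(dpy, RootWindow(dpy, scr), vinfo.visual, AllocNone);
    a.border_pixel = 0;
    a.background_pixel = 0;

    Window w = XCreateWindow(dpy, RootWindow(dpy, scr), 0, 0, 512, 256, 0,
                             vinfo.depth, InputOutput, vinfo.visual,
                             CWColormap | CWBorderPixel | CWBackPixel, &a);
    XMapWindow(dpy, w);
    XFlush(dpy);
    printf("window created with depth %d (visual id 0x%lx)\n",
           vinfo.depth, vinfo.visualid);

    XEvent ev;
    while (1) XNextEvent(dpy, &ev);   /* keep the window mapped; Ctrl-C to quit */
}

xwininfo on that window should then report Depth: 30, just like the root
window above.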

kind regards
Kai-Uwe
--
Kai-Uwe Behrmann
www.behrmann.name
Milan Knížek
2014-09-01 19:52:03 UTC
Permalink
Post by Kai-Uwe Behrmann
Post by Marwan Daar
Not sure if this is helpful, relevant, or even meaningful, but my
understanding is that regular GeForce cards only support 10 bit in a
DirectX environment, whereas you need a Quadro for 10-bit OpenGL support
I have tried to get some info on nVidia's devtalk Linux forums. (59
views, no response yet.)

The README mentions a few bits about the requirements:

This driver release supports X screens with screen depths of 30 bits
per pixel (10 bits per color component). This provides about 1 billion
possible colors, allowing for higher color precision and smoother
gradients. When displaying a depth 30 image, the color data may be
dithered to lower bit depths, depending on the capabilities of the
display device and how it is connected to the GPU. Some devices
connected via analog VGA or DisplayPort can display the full 10 bit
range of colors. Devices connected via DVI or HDMI, as well as laptop
internal panels connected via LVDS, will be dithered to 8 or 6 bits per
pixel.

To work reliably, depth 30 requires X.Org 7.3 or higher and pixman
0.11.6 or higher. In addition to the above software requirements,
many X applications and toolkits do not understand depth 30 visuals as
of this writing. Some programs may work correctly, some may work but
display incorrect colors, and some may simply fail to run. In
particular, many OpenGL applications request 8 bits of alpha when
searching for FBConfigs. Since depth 30 visuals have only 2 bits of
alpha, no suitable FBConfigs will be found and such applications will
fail to start.
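
To make that last point concrete: an application that wants a 10-bit GLX
framebuffer has to ask for it explicitly and keep the alpha request small.
A minimal, untested sketch of such an FBConfig request (illustrative only,
not code from the driver README):

#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void) {
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    /* Ask for 10 bits per colour channel and at most the 2 alpha bits a
     * depth-30 visual can offer; combining 10-bit colour with an 8-bit
     * alpha request matches nothing, as the README explains. */
    static const int attribs[] = {
        GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_RED_SIZE,      10,
        GLX_GREEN_SIZE,    10,
        GLX_BLUE_SIZE,     10,
        GLX_ALPHA_SIZE,    2,
        GLX_DOUBLEBUFFER,  True,
        None
    };

    int n = 0;
    GLXFBConfig *cfgs = glXChooseFBConfig(dpy, DefaultScreen(dpy), attribs, &n);
    printf("matching 10-bit FBConfigs: %d\n", n);
    if (cfgs && n > 0) {
        int r, g, b;
        glXGetFBConfigAttrib(dpy, cfgs[0], GLX_RED_SIZE,   &r);
        glXGetFBConfigAttrib(dpy, cfgs[0], GLX_GREEN_SIZE, &g);
        glXGetFBConfigAttrib(dpy, cfgs[0], GLX_BLUE_SIZE,  &b);
        printf("first config: R%d G%d B%d\n", r, g, b);
        XFree(cfgs);
    }
    XCloseDisplay(dpy);
    return 0;
}
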
Post by Kai-Uwe Behrmann
OpenGL works fine with 30-bit visuals under X.org, be it Krita, Compiz,
KWin or ICC Examin. xwininfo -root | grep Depth shows me Depth: 30.
That means any application which is capable of encoding 10 bits per
plane can display 30-bit colour. It is merely a question of finding a
toolkit that supports it. Using Xlib directly should enable applications
for high bit depth, which is mostly abstracted away by Linux graphics APIs
I can also run the X server with 30-bit depth (the X.org log confirms it)
with a 32 bpp framebuffer (so only 2 bits are left for transparency). I have
found the same info in an Xorg.log posted on internet forums by a Quadro
card user.
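
For reference, the depth-30 mode itself is switched on in xorg.conf with a
screen section along these lines (the Device / Monitor identifiers below
are placeholders for whatever the rest of the config defines):

Section "Screen"
    Identifier   "Screen0"
    Device       "Device0"
    Monitor      "Monitor0"
    DefaultDepth 30
    SubSection "Display"
        Depth    30
    EndSubSection
EndSection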

I also checked on the Dell U2713H support community forum that the LCD
actually has a 10-bit input and works with MS Windows + Quadro / FirePro +
Photoshop.
Post by Kai-Uwe Behrmann
these days. The easier way is using OpenGL.
Is there anything special for Krita to display in 30-bit depth other
than to choose "Enable OpenGL" in Preferences / Display and load a
16-bit encoded image? I used Compiz as WM.

regards,
Milan
Kai-Uwe Behrmann
2014-09-02 04:44:50 UTC
Permalink
Post by Milan Knížek
Post by Kai-Uwe Behrmann
these days. The easier way is using OpenGL.
Is there anything special for Krita to display in 30-bit depth other
than to choose "Enable OpenGL" in Preferences / Display and load a
16-bit encoded image? I used Compiz as WM.
Krita -> [top menu] Settings -> Configure Krita -> Display [left tab] ->
[x] Enable OpenGL

+ a proper gradient test image, e.g.
http://www.oyranos.org/images/16bit-gradient.png
from http://www.oyranos.org/wiki/index.php?title=Test_Images
should reveal the smoothness of gradients. Compare at 100% zoom in
Krita and e.g. Firefox.
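
A comparable ramp can also be generated locally. A small, untested libpng
sketch (file name arbitrary) that writes a 1024-step 16-bit grey gradient:

#include <png.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    const int width = 1024, height = 128;
    FILE *fp = fopen("gradient16.png", "wb");
    if (!fp) { perror("fopen"); return 1; }

    png_structp png = png_create_write_struct(PNG_LIBPNG_VER_STRING, NULL, NULL, NULL);
    png_infop info = png_create_info_struct(png);
    if (!png || !info || setjmp(png_jmpbuf(png))) { fclose(fp); return 1; }

    png_init_io(png, fp);
    png_set_IHDR(png, info, width, height, 16, PNG_COLOR_TYPE_GRAY,
                 PNG_INTERLACE_NONE, PNG_COMPRESSION_TYPE_DEFAULT,
                 PNG_FILTER_TYPE_DEFAULT);
    png_write_info(png, info);

    /* PNG stores 16-bit samples big-endian, so write the bytes explicitly. */
    unsigned char *row = malloc((size_t)width * 2);
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            unsigned v = (unsigned)x * 65535u / (unsigned)(width - 1); /* 0..65535 ramp */
            row[2 * x]     = (unsigned char)(v >> 8);
            row[2 * x + 1] = (unsigned char)(v & 0xffu);
        }
        png_write_row(png, row);
    }
    png_write_end(png, info);
    png_destroy_write_struct(&png, &info);
    free(row);
    fclose(fp);
    return 0;
}

(build with e.g. cc gradient16.c -lpng, then open the result in Krita as above)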

kind regards
Kai-Uwe
Milan Knížek
2014-09-09 18:33:28 UTC
Permalink
Post by Kai-Uwe Behrmann
Post by Milan Knížek
Post by Kai-Uwe Behrmann
these days. The easier way is using OpenGL.
Is there anything special for Krita to display in 30-bit depth other
than to choose "Enable OpenGL" in Preferences / Display and load a
16-bit encoded image? I used Compiz as WM.
Krita -> [top menu] Settings -> Configure Krita -> Display [left tab] ->
[x] Enable OpenGL
+ a proper gradient test image e.g.
http://www.oyranos.org/images/16bit-gradient.png
from http://www.oyranos.org/wiki/index.php?title=Test_Images
should reveal the smothness of gradients. Compare with zooming 100% in
Krita and e.g. FF .
Thanks for the hint. No success so far.

Can you please do some extra stuff for me so that I can compare?

Run "xwininfo" in a terminal and click on the Krita window. There should
be some output like:
...
Depth: 30
Visual: 0x21
Visual Class: TrueColor
Border width: 0
Class: InputOutput
Colormap: 0x20 (installed)
...

Then also run glxinfo piped through grep to filter for your actual "Visual"
ID (which is 21 in my case):
$ glxinfo -v | grep -A7 "ID: 21"
Visual ID: 21 depth=30 class=TrueColor, type=(none)
bufferSize=30 level=0 renderType=rgba doubleBuffer=1 stereo=0
rgba: redSize=10 greenSize=10 blueSize=10 alphaSize=0 float=N sRGB=Y
auxBuffers=4 depthSize=24 stencilSize=0
accum: redSize=16 greenSize=16 blueSize=16 alphaSize=16
multiSample=0 multiSampleBuffers=0
visualCaveat=None
Opaque.

Thanks!
Milan
Kai-Uwe Behrmann
2014-09-09 18:45:35 UTC
Permalink
Post by Milan Knížek
Can you please do some extra stuff for me so that I can compare?
Run "xwininfo" in a terminal and click on the Krita window. There should
...
Depth: 30
Visual: 0x21
Visual Class: TrueColor
Border width: 0
Class: InputOutput
Colormap: 0x20 (installed)
...
Then run also glxinfo piped through grep to filter your actual "Visual"
$ glxinfo -v | grep -A7 "ID: 21"
Visual ID: 21 depth=30 class=TrueColor, type=(none)
bufferSize=30 level=0 renderType=rgba doubleBuffer=1 stereo=0
rgba: redSize=10 greenSize=10 blueSize=10 alphaSize=0 float=N sRGB=Y
auxBuffers=4 depthSize=24 stencilSize=0
accum: redSize=16 greenSize=16 blueSize=16 alphaSize=16
multiSample=0 multiSampleBuffers=0
visualCaveat=None
Opaque.
The outputs here are all identical.
