and yellow are the primary colors of the subtractive
system. So much for the theory (again). In practice, a
subtractive cyan looks quite different from an additive
cyan, and an additive red looks even more different
from a subtractive red.
However, to make sure that a display shows the correct
white and that all three channels behave correctly at
every luminance level, the display needs to be adjusted.
Ways of Calibration
The color depth of the graphics signal from the compu-
ter to the display is 8bit, which means 16.7 million colors
or 256 shades per channel. If this signal is altered, the
number of remaining colors drops; the larger the
correction, the greater the loss. The best approach is
therefore to avoid any correction of the signal. But as
we have learned, a display needs to be adjusted to show
colors correctly. That means the corrections have to be
done either in the display or on the computer.
If the signal is corrected on the computer (in the
Look Up Table [LUT] of the graphics card), the signal
sent to the display no longer contains the full 8bit
bandwidth. This results in a loss of detail and banding
in gradients. Up to 20% of the shades per channel, or
50% of all colors, will be lost.
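The effect is easy to reproduce. The following Python sketch uses an assumed, purely illustrative correction curve (a mild gamma change plus a reduced channel gain, not the actual curve written by iColor Display) and counts how many of the 256 input shades survive an 8bit graphics card LUT correction:

```python
# Illustrative only: an assumed gamma/white-point correction applied
# in the 8bit graphics card LUT, to show how shades collapse.
import numpy as np

levels = np.arange(256)                                  # 8bit input: 256 shades per channel
corrected = np.round(255 * (levels / 255) ** 1.1 * 0.8).astype(int)  # hypothetical correction

per_channel = len(np.unique(corrected)) / 256            # fraction of shades that survive
all_colors = per_channel ** 3                            # same loss applied to R, G and B

print(f"shades lost per channel: {1 - per_channel:.0%}")   # roughly 20%
print(f"colors lost in total:    {1 - all_colors:.0%}")    # roughly 50%
```

With this assumed curve the numbers come out close to the figures quoted above: about one fifth of the shades per channel collapse, which compounds to roughly half of all RGB combinations.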
A better way is to correct as much as possible inside
the display and let only the remaining deviations be
corrected by the graphics card LUT. The loss can then
be reduced to 10% per channel, or 25% of all colors.
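As a rough check of these figures: keeping 90% of the shades in each of the three channels leaves about 0.9 × 0.9 × 0.9 ≈ 73% of all RGB combinations, so roughly a quarter of the colors are lost, compared to 0.8 × 0.8 × 0.8 ≈ 51% (about half lost) when the full correction sits in the graphics card LUT.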
However, the best way is to do all corrections inside
the display and also correct the non-linear behaviour
there. This will help to keep the full bandwidth. To do
so, a dedicated interface is needed to communicate
with the display. This approach is called hardware
calibration, while the first two ways are called soft-
ware calibration.
The hardware-calibrated Quato displays use either
USB or DDC/CI to communicate with the calibration
software on the computer and are adjusted internally
with up to 16bit precision. The high-precision LUT inside
the display is then used to correct all the deviations,
and the signal is downscaled to 10bit when it passes
the internal display driver circuits. The scaling can
be compared with high-end photography, where
images are kept in 16bit mode until all corrections
have been done. Then a scaling to 8bit shows no visible loss.
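The difference between the two paths can be sketched numerically. The snippet below reuses the same assumed correction curve as above (not the display's real one) and applies it once in an 8bit graphics card LUT and once at high precision with a 10bit output, counting how many of the 256 input shades remain distinct:

```python
# Illustrative comparison: the same assumed correction applied in an
# 8bit graphics card LUT versus inside the display, where it is
# computed at high precision and output through a 10bit driver stage.
import numpy as np

def corrected(levels, out_max):
    """Hypothetical gamma/white-point correction quantised to out_max steps."""
    return np.round(out_max * (levels / 255) ** 1.1 * 0.8).astype(int)

levels = np.arange(256)                      # 8bit signal from the computer

software = corrected(levels, 255)            # correction in the 8bit graphics card LUT
hardware = corrected(levels, 1023)           # correction inside the display, 10bit output

print("8bit LUT path keeps      ", len(np.unique(software)), "of 256 input shades")  # ~205
print("internal 10bit path keeps", len(np.unique(hardware)), "of 256 input shades")  # 256
```

In this sketch the finer output quantisation lets every input shade keep its own output value, which is what keeping the full bandwidth means in practice.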
Figure: Subtractive system (white) and additive system (black).
Figure: LUT correction on the graphics card (left) versus correction inside the display LUT (right).
Figure: Schematics of the process of a hardware calibration (USB interface to an embedded RISC MCU, processing with 16bit per channel, 16bit red, green and blue LUTs and 16bit color-management routines, gamma, whitepoint and color correction, frame modulation for 30bit, temperature and luminance sensors with luminance stabilisation, uniformity circuit with 140 measurement points, DVI signal path).