I have Windows "Color Quality" set to "High (32-bit)". Why can't I display all the colors in a 16-bit image?
Last updated on:
24 Jun 2008
MaxIm DL Version 5
Most monitors are only 8-bit capable, meaning they can display 256 levels or shades of each color: red, green, and blue. These are combined to produce all the different colors you see on your screen, for a total of about 16 million colors (256 x 256 x 256 = 16,777,216). Setting Windows to "Highest (32 bit)" Color Quality simply means that Windows has those 16 million colors to work with.
The next question is usually, "Well, if Windows can use 16 million colors, why can't it display a 16-bit image on the monitor?" The answer is that "16-bit" refers to the image's bit depth. An 8-bit image has 256 shades or levels of each color (red, green, and blue), while a 16-bit image has 65,536 shades or levels of each color. As mentioned above, a typical computer monitor is only 8-bit and can display only 256 shades or levels. So we use the "Stretch" command to bring the 16-bit image into an 8-bit range that the monitor can display, without losing any of the image data.
The "Stretch" command does not actually alter the image, it only temporarily changes how the image is displayed on screen so we can see it better.
Copyright 2010, Diffraction Limited