Steve's Digicams Forums

Steve's Digicams Forums > General Discussion > How digital cameras detect color

BG Davis Aug 2, 2015 7:58 PM

How digital cameras detect color
I understand that digital camera sensors do not capture color, and that color is added via filters (most often a Bayer array). However, the explanations of how a non-color (white light or grayscale) image captured by the sensor is translated into a color image always seem to leave something out. There seems to be a leap of logic.
Obviously, the non-color image will have brighter and darker areas, corresponding to greater and lesser energy (intensity of light) detected. I still don’t understand how putting a color filter over this is going to create an accurate color image. It seems like trying to colorize a grayscale image; the results would seem to be guesswork. Talking about interpolation, etc., really does not give the answer to this question, any more than interpolation can provide the original colors in an old BW photo. You can interpolate all you want, but you have no way of knowing what the original colors really were.
For example, take a brighter area in the grayscale image captured by the sensor. How does the software know what color this area was in the original scene?
(This question became very real for me a few years ago, when I took a photo of a blood-red sunrise, in a blood-red sky flecked with blood-tinged clouds, and my Canon camera decided that all that red sky really should be blue, and the clouds tinged with a tasteful pink. Turned a really unusual image into something totally pedestrian. And many sunset pictures render subtle pinks and violets as garish yellows.)
I will be eternally grateful for a comprehensible explanation that fully explains the process of adding color to the image captured by the sensor.

TCav Aug 3, 2015 5:41 AM

Each photoreceptor of the image sensor detects light through one of the individual red, green, and blue filters of the Bayer Filter.

For each pixel of an image, the level of the color matching that photoreceptor's filter is taken directly from its value; the levels for the other two colors are averaged from the values of adjacent photoreceptors.
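That averaging step is the demosaicing stage. As a rough illustration only (not any camera maker's actual algorithm), here is a minimal bilinear demosaic in Python, assuming an RGGB Bayer layout and NumPy; real cameras use far more sophisticated interpolation:

```python
import numpy as np

def demosaic_bilinear(raw):
    """Fill in the two missing colors at each photosite by averaging
    neighbors. raw: 2-D array sampled under an RGGB Bayer mosaic."""
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))
    # Masks marking which mosaic site carries which color filter.
    r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)
    for ch, mask in enumerate((r_mask, g_mask, b_mask)):
        plane = np.where(mask, raw, 0.0)
        count = mask.astype(float)
        # Sum each 3x3 neighborhood of known samples (wrap-around at
        # the edges, for brevity) and divide by how many were known.
        ksum = sum(np.roll(np.roll(plane, dy, 0), dx, 1)
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        kcnt = sum(np.roll(np.roll(count, dy, 0), dx, 1)
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        avg = ksum / np.maximum(kcnt, 1.0)
        # Where the site's own filter matches, use its value directly;
        # elsewhere, fall back to the neighborhood average.
        rgb[..., ch] = np.where(mask, raw, avg)
    return rgb
```

A uniform scene comes back uniform; edges and fine color detail are where naive averaging like this produces the artifacts that fancier demosaicing algorithms try to avoid.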

For cameras that use Bayer Filters (there are some that don't, like Sigma's cameras with Foveon image sensors, and the Leica M Monochrom, which captures only B&W images by using a conventional image sensor without a Bayer filter), B&W images are created from the color images, most typically by averaging the three color values for each pixel and applying that average to all three colors of the pixel. Thus, if a pixel has values of 64, 128, and 192 for red, green, and blue, respectively (a muted blue), those values would be replaced with 128 for all three colors, resulting in a medium gray.
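That numeric example can be sketched directly. This is the plain-average method described above; note that many converters instead use perceptual luma weights:

```python
def to_grayscale_average(r, g, b):
    # Replace all three channels with their plain average,
    # as in the 64/128/192 example above.
    v = round((r + g + b) / 3)
    return (v, v, v)

to_grayscale_average(64, 128, 192)  # -> (128, 128, 128), a medium gray
```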

I hope this helps.

wanaclick Aug 3, 2015 5:49 AM

Elementary school physics taught us that what we see as white light is composed of all colours.
A rainbow splits white light into the individual colours of the spectrum because each wavelength is refracted by a slightly different amount.
A disk painted with the VIBGYOR colours appears white when spun, which is the reverse of the rainbow.

The colour filters in a camera work on the same principle: each one passes only its own slice of that spectrum.

What your Canon did had to do with the white balance setting, the JPEG processing engine, and the scene mode (if any) you used.
If you had shot RAW, you would have gotten more accurate colours, which you could then have tweaked to your taste.

VTphotog Aug 3, 2015 7:06 AM

The color filters do not add color to a grayscale image - the filters subtract color from a color image. With a Bayer filter array, the pixels behind the red filters see only an intensity of light from which everything but red has been filtered out. The same goes for the blue and green pixels. This creates three pictures of the light levels in the scene, and software recombines them into a single RGB image.
Your sunset colors turned out incorrect either because, as wanaclick mentions, the white balance was incorrect, or because the scene was overexposed and the camera interpreted the brightest part as white when it was not, skewing the rest of the colors.

Edit: I should have mentioned that the camera reads overall brightness mainly from the green pixels, both because there are twice as many of them and because green is closest to the human eye's peak sensitivity. If there is little or no green in the subject at the highest light intensities, you will have skewed colors and brightness problems.
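Both of those reasons can be illustrated in a short sketch. The 2x2 tile below is the standard RGGB Bayer layout, and the luma weights are the Rec. 601 coefficients; a given camera may not use exactly these numbers:

```python
# Reason 1: half the photosites in a Bayer mosaic are green.
tile = [["R", "G"],
        ["G", "B"]]  # one repeating 2x2 RGGB tile
counts = {}
for row in tile:
    for f in row:
        counts[f] = counts.get(f, 0) + 1
# counts == {"R": 1, "G": 2, "B": 1}: greens outnumber reds and blues 2:1.

# Reason 2: perceived brightness is dominated by green. The Rec. 601
# luma formula gives green by far the largest weight.
def luma(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b
```

A pure green patch (0, 255, 0) reads nearly twice as bright as a pure red one (255, 0, 0), which is why losing green information in the highlights skews both brightness and color.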

TCav Aug 3, 2015 9:29 AM

Image sensors don't capture B&W images. They capture the intensities of red, green and blue light, with each photoreceptor recording the intensity for the color of the filter over it. The camera uses those red, green and blue intensities to construct a color image.

It makes a B&W image from the color image by averaging the red, green and blue values for each pixel.

As for your sunset, I think that what you saw as a red sky was actually a violet sky.

The visible spectrum is as follows:
  • red - 620–750 nm
  • orange - 590–620 nm
  • yellow - 570–590 nm
  • green - 495–570 nm
  • blue - 450–495 nm
  • violet - 380–450 nm
With the filters used in a Bayer Filter, violet light passes through the blue filter but not the red or green filters, so the camera presents the color as blue because it cannot tell the difference. This situation is unusual, but not extraordinary, and has been discussed here before. See:
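Using the nominal band edges listed above, the ambiguity can be sketched like this. This is hypothetical, idealized code; real filter passbands overlap and vary by camera, and the violet "leak" into the blue channel is the assumption described above:

```python
# Nominal visible bands from the list above (nm).
BANDS = {"red": (620, 750), "green": (495, 570), "blue": (450, 495)}

def channels_passing(wavelength_nm):
    """Which Bayer channels respond to a given wavelength (idealized)."""
    passed = [c for c, (lo, hi) in BANDS.items() if lo <= wavelength_nm <= hi]
    # Assumption: the blue filter also transmits violet (380-450 nm),
    # so violet light registers only in the blue channel.
    if 380 <= wavelength_nm < 450:
        passed.append("blue")
    return passed
```

Violet light at 400 nm and blue light at 470 nm both come back as ["blue"], so once the sensor has sampled the scene, the camera has no way to tell them apart.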

markbrown14 Dec 5, 2015 2:52 AM

How digital cameras detect color
I also want to know how digital cameras detect color.
