Wednesday, November 4, 2009

Why monochrome cameras produce better images than color cameras

Let’s start by recalling how CCD and CMOS sensors work. I’ll spare you all the physics and just say that when a photon lands on the silicon it generates a charge. The charge is converted to a voltage and digitized as a grayscale value. In other words, the sensor has no knowledge of the wavelength of the incident light; it just knows how many photons arrived.
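
To make that concrete, here is a toy sketch of a single pixel. It is not the behavior of any particular sensor; the full-well capacity, quantum efficiency and 8-bit output are assumed numbers, chosen only to show that the digitized value depends on the photon count and nothing else:

```python
# Toy model of one pixel: the output depends only on how many photons
# arrived, not on their wavelength. All constants are illustrative assumptions.
FULL_WELL = 20000   # electrons the pixel can hold before saturating (assumed)
MAX_DN = 255        # 8-bit digital output (assumed)

def pixel_value(photon_count, quantum_efficiency=0.5):
    """Convert a photon count into a digitized grayscale value."""
    electrons = min(photon_count * quantum_efficiency, FULL_WELL)
    return round(electrons / FULL_WELL * MAX_DN)

# 10000 red photons and 10000 blue photons give exactly the same reading:
print(pixel_value(10000), pixel_value(10000))  # -> 64 64
```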

So how do we get color images? Simply by filtering the photons that each pixel receives. A Bayer filter in front of the sensor filters the incoming photons by wavelength: in each 2×2 tile of pixels, two detect only green photons, one detects only red, and one detects only blue.
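
As a rough sketch of what that does to the data (assuming the common RGGB layout; other arrangements exist), each pixel keeps exactly one of the three channels:

```python
import numpy as np

def bayer_mosaic(rgb):
    """Simulate a Bayer filter: keep one color channel per pixel (RGGB tiles)."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red:   1 pixel per 2x2 tile (25%)
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green: 2 pixels per 2x2 tile (50%)
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue:  1 pixel per 2x2 tile (25%)
    return mosaic
```

Turning that mosaic back into a full-color image (demosaicing) means interpolating the two missing channels at every pixel.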

Now with a Bayer filter, only 25% of the pixels receive blue light, and since the text in the images I posted yesterday is blue, only ¼ of the pixels actually “see” those characters. Putting it simply, a color camera delivers lower effective resolution than a monochrome camera built on the same sensor.
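
A quick back-of-the-envelope check (the 1280×1024 sensor size is just an example, not the camera from yesterday’s post):

```python
width, height = 1280, 1024                         # hypothetical sensor resolution (assumed)

mono_blue_pixels = width * height                  # monochrome: every pixel sees the blue text
bayer_blue_pixels = (width // 2) * (height // 2)   # Bayer: one blue pixel per 2x2 tile

print(mono_blue_pixels, bayer_blue_pixels)         # -> 1310720 327680, a 4x difference
```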

But this isn’t the end of the story.
