Steve's Digicams Forums > Digicam Help > General Discussion

Old Oct 28, 2005, 9:51 AM   #1
Senior Member
 
Caelum's Avatar
 
Join Date: Sep 2005
Posts: 1,030
Default

Hi, my first digital camera, the 1MP Kodak DC210+ (Plus), is now hitting seven years old. I still use it; it takes good pics, it's stupid simple, and it has a useful wider-than-normal lens for indoor shots and a good 8" macro (I know it's slow and limited, but please spare me bashing this camera, it has a soft spot in my heart).

The point is that the camera has suddenly started developing stuck pixels. I say stuck pixels because they appear as bright pixels in all shots regardless of exposure. First one, then three, within the past few months. I tried taking a very bright photo to see if I could shock the stuck pixels back into action; it seems it might perhaps have unstuck one, but at least two remain. I suspect more will come.
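(The check I did can be sketched in a few lines of Python. This is purely an illustration of the idea, not anything a camera does internally; the frame sizes, values and threshold below are all made up.)

```python
import numpy as np

def find_stuck_pixels(frames, threshold=250):
    """Flag pixels that are near-saturated in every frame.

    A stuck ("hot-on") pixel reads bright regardless of exposure, so
    intersecting the bright-pixel masks of several differently exposed
    shots isolates it from legitimately bright scene content.
    """
    stuck = np.ones(frames[0].shape, dtype=bool)
    for frame in frames:
        stuck &= frame >= threshold
    return np.argwhere(stuck)

# Three simulated 8-bit frames at different exposures; pixel (2, 3) is stuck.
rng = np.random.default_rng(0)
frames = [rng.integers(0, 180, size=(5, 6), dtype=np.uint8) for _ in range(3)]
for f in frames:
    f[2, 3] = 255
print(find_stuck_pixels(frames))  # → [[2 3]]
```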

So my question is: what has other people's experience been with their CCDs starting to fail or degrade after years of use? I guess we all realize that digital cameras are in no way an investment, but it makes me question plunking down a lot of cash for a high-end one if it has a limited life span beyond the technological one.
Caelum is offline   Reply With Quote
Old Oct 28, 2005, 10:09 AM   #2
Administrator
 
Join Date: Jun 2003
Location: Savannah, GA (USA)
Posts: 22,378
Default

Most newer sensors are going to have some bad pixels (probably more so with smaller and higher resolution sensors).

You just don't see them when cameras come from the factory, because they're already mapped out.

It's also not uncommon for a CCD to develop more bad pixels as a camera ages.

Basically, the manufacturer keeps a bad pixel table in memory (EEPROM), and the camera interpolates around those pixels during processing (taking values from adjacent pixels and replacing them).
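A rough sketch of that interpolation step, assuming a simple neighbour-median scheme (real firmware is undoubtedly more sophisticated; the function name and values here are illustrative only):

```python
import numpy as np

def map_out_bad_pixels(image, bad_pixel_table):
    """Replace each bad pixel with the median of its valid neighbours.

    This mimics, in simplified form, what a camera's pipeline does with
    its stored bad-pixel table: the defective photosite's value is
    discarded and interpolated from adjacent ones.
    """
    fixed = image.astype(np.float64)
    h, w = image.shape
    for (y, x) in bad_pixel_table:
        neighbours = [
            fixed[ny, nx]
            for ny in (y - 1, y, y + 1)
            for nx in (x - 1, x, x + 1)
            if 0 <= ny < h and 0 <= nx < w
            and (ny, nx) != (y, x)
            and (ny, nx) not in bad_pixel_table
        ]
        fixed[y, x] = np.median(neighbours)
    return fixed.astype(image.dtype)

img = np.full((4, 4), 100, dtype=np.uint8)
img[1, 2] = 255                                  # one stuck pixel
print(map_out_bad_pixels(img, {(1, 2)})[1, 2])   # → 100
```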

With older camera models, when you had a bad pixel (either dead or stuck on), the manufacturers typically ran a service program to update the bad pixel map. Most consumers thought the CCD was being replaced, when in fact the camera's image processing was just interpolating to replace the bad ones. ;-)

You can find software to update some models yourself now (for a number of consumer cameras made by Nikon, Olympus and others). You can even find software to update the bad pixel table in some DSLR models (for example, the Nikon D100). This software is not supported by the manufacturers (hackers figured out how to call the hidden routines in the cameras).

Here is one example that can work with some of the Nikon and Olympus consumer models:

http://e2500.narod.ru/ccd_defect_e.htm

With some newer cameras, the manufacturers started finding a way to let the camera do it without the need for separate service software. Olympus started it first (AFAIK), beginning with their Olympus E10 model (this 4MP 2/3" Olympus-designed CCD was very bad for getting stuck pixels, so they came out with a firmware upgrade designed to let the user call the service routine to check for them and map them out). Many newer models from Olympus also have a menu choice to remap bad pixels (even though they're not using Olympus-designed sensors anymore).

Konica-Minolta apparently put an automatic routine into some of their newer models to check for bad pixels and map them out monthly. I've never seen it officially confirmed, but more than one user has reported that it works on some of their higher end models (and you can fool the camera into running the bad pixel remap by setting the camera's date forward). Neat feature.

Yes, pixels can go bad during CCD aging. But, I don't think there is any "rule of thumb" on how many (if any) may go bad.

I've got an older Nikon Coolpix 950 that doesn't have any (that I can see anyway). Yet, I've got a newer Konica KD-510z that has one "borderline pixel" (sometimes hot, depending on the shutter speed, lighting and camera temperature).

You can also find software designed to detect and map out bad pixels during Post Processing. Here is an example:

http://www.tawbaware.com/pixelzap.htm


JimC is offline   Reply With Quote
Old Oct 28, 2005, 10:49 AM   #3
Senior Member
 
Caelum's Avatar
 
Join Date: Sep 2005
Posts: 1,030
Default

Thanks for all the info, much appreciated. I'm thinking this automatic interpolation workaround/mapping out could be worrisome, since it ultimately impacts the image quality without our knowledge, but then again, with a very high pixel count, perhaps not by much. I guess since the DC210+ has a low pixel count (1MP), stuck pixels show much more than they would otherwise. But I also guess these high pixel count CCDs haven't been around long enough to really know their expected lifespan.

It also just crossed my mind that I had an old Sony video camera that just stopped working, apparently due to leaky capacitors, so in that case the CCD wasn't its demise. I guess nothing lasts forever, even for good old reliable 35mm cameras; perhaps one day we won't find any more film. :O
Caelum is offline   Reply With Quote
Old Oct 28, 2005, 11:48 AM   #4
Administrator
 
Join Date: Jun 2003
Location: Savannah, GA (USA)
Posts: 22,378
Default

Interpolation algorithms are very complex. Even without taking bad or stuck pixels into consideration, each photosite is only sensitive to one of three colors (or four in the case of some sensors). The exception is some sensors made by Foveon (which is why a 3 Megapixel DSLR model like the Sigma SD10, using one of their sensors, can often rival 6 to 8 Megapixel models in detail captured, depending on the subject type).

Manufacturers use a type of dye on the photosites so that only one color passes for each one (known as the Color Filter Array). In most array types, you'll have twice as many photosites sensitive only to green versus red and blue (probably because the human eye is more sensitive to green). Then, demosaic algorithms determine the final color and brightness for each pixel in the image by looking at the values of multiple photosites during processing of the data from the sensor.
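A minimal sketch of the simplest such demosaic step, bilinear green interpolation on an RGGB Bayer tile (the layout and numbers below are invented for illustration; real demosaic algorithms are far more elaborate):

```python
import numpy as np

# A 4x4 tile of the common RGGB Bayer pattern: each photosite records
# only one colour, and green sites outnumber red and blue two to one.
BAYER_RGGB = np.array([
    ["R", "G", "R", "G"],
    ["G", "B", "G", "B"],
    ["R", "G", "R", "G"],
    ["G", "B", "G", "B"],
])

def green_at(raw, y, x):
    """Bilinear estimate of green at a non-green photosite: average the
    four green neighbours (up, down, left, right)."""
    return (raw[y - 1, x] + raw[y + 1, x] + raw[y, x - 1] + raw[y, x + 1]) / 4.0

raw = np.zeros((4, 4))
raw[0, 1] = raw[2, 1] = raw[1, 0] = raw[1, 2] = 80   # green neighbours of (1, 1)
print(green_at(raw, 1, 1))  # → 80.0
```

Note that site (1, 1) in this tile records only blue; its green value has to be estimated entirely from its neighbours, which is the point Jim is making about interpolation happening for every pixel anyway.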

It's a complex process. Here is an interesting white paper discussing some of the methods commonly used:

http://www.ece.gatech.edu/research/l.../bahadir05.pdf

You're not losing as much detail as you think when a manufacturer takes values from adjacent pixels (which they're really doing anyway to give you the final result). Chances are, the bad pixel mapping happens after the raw image processing (so it could degrade the image a little more that way, and raw images may not have them mapped out). But, with millions of pixels, I doubt anyone would notice a few bad ones (unless they are all right together).

Again, they're probably doing it anyway (mapping out bad pixels at the factory before you even know that they are there). So, it's best to have a way for the user to do it themselves for the long run (IMHO anyway).
JimC is offline   Reply With Quote
Old Oct 28, 2005, 12:12 PM   #5
Senior Member
 
Join Date: Sep 2005
Posts: 1,093
Default

JimC wrote:
Quote:
Even without taking bad or stuck pixels into consideration, each photosite is only sensitive to one of three colors (or four in the case of some sensors).
That was a very interesting post. Just one question -- for those sensors that use four colors, what are those colors?
tclune is offline   Reply With Quote
Old Oct 28, 2005, 12:36 PM   #6
Administrator
 
Join Date: Jun 2003
Location: Savannah, GA (USA)
Posts: 22,378
Default

tclune wrote:
Quote:
JimC wrote:
Quote:
Even without taking bad or stuck pixels into consideration, each photosite is only sensitive to one of three colors (or four in the case of some sensors).
That was a very interesting post. Just one question -- for those sensors that use four colors, what are those colors?
Red, Green, Blue and Emerald are the colors used in a Color Filter Array type introduced by Sony (they use it in their DSC-F828 model, but other camera manufacturers using a Sony 8MP 2/3" sensor don't get this array).

Here is a press release about this CFA:

http://www.steves-digicams.com/pr/so...lorCCD_pr.html


JimC is offline   Reply With Quote
Old Oct 28, 2005, 2:14 PM   #7
E.T
Senior Member
 
E.T's Avatar
 
Join Date: Jul 2003
Posts: 921
Default

JimC wrote:
Quote:
But, with millions of pixels, I doubt anyone would notice a few bad ones (unless they are all right together).
I would say that the number of stuck pixels is way, way over a few in all consumer products.

In astronomical photography, one of the reasons for the very different prices of CCD cameras with equal specs is that sensors are graded after manufacturing by the number of stuck pixels. Only a very small fraction of the sensors made have few or no stuck pixels, so their price is very high compared to lower-graded sensors produced in exactly the same batch.
(I wouldn't be surprised if the price difference is on the order of $10 versus $1,000.)
E.T is offline   Reply With Quote
Old Oct 28, 2005, 2:50 PM   #8
Administrator
 
Join Date: Jun 2003
Location: Savannah, GA (USA)
Posts: 22,378
Default

E.T wrote:
Quote:
JimC wrote:
Quote:
But, with millions of pixels, I doubt anyone would notice a few bad ones (unless they are all right together).
I would say that the number of stuck pixels is way, way over a few in all consumer products.

In astronomical photography, one of the reasons for the very different prices of CCD cameras with equal specs is that sensors are graded after manufacturing by the number of stuck pixels. Only a very small fraction of the sensors made have few or no stuck pixels, so their price is very high compared to lower-graded sensors produced in exactly the same batch.
(I wouldn't be surprised if the price difference is on the order of $10 versus $1,000.)
It depends on what you want to classify as "stuck", too (temperature, light levels, exposure time, how bright the pixels are). ;-)

You're dealing with longer exposures for astronomy, with cooling systems available for the sensors, since hot pixels are going to be a problem with longer exposures and higher CCD temperatures.

So, a test for stuck pixels at 70 degrees and 1/40 second may yield totally different results compared to a test at 100 degrees and 1/2 second. Ditto for where you set the brightness thresholds for these pixels (at what signal level do you determine it's out of tolerance?). The same goes for dead pixels (at what temperature, exposure time and light level is the signal generated not adequate?).

I don't know what the camera manufacturers consider to be an acceptable level of bad pixels in newer models. From what I've seen, the smaller and higher resolution sensors seem to have more of a problem with hot pixels at anything other than the shortest exposure times (of course, what's hot is a matter of opinion, too).

But, most newer models have "dark frame subtraction" noise reduction systems that kick in at around 1/2 to 1 second or longer exposures (replacing hot pixel locations in the actual image based on their locations in a black frame taken with the same exposure time to find potential hot pixels).
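A simplified sketch of that dark frame idea, assuming a plain brightness threshold on the black frame (the function, threshold and pixel values here are invented for illustration; real implementations vary by manufacturer):

```python
import numpy as np

def dark_frame_subtract(image, dark_frame, hot_threshold=30):
    """Simplified dark-frame noise reduction.

    The camera takes a second exposure of the same length with the
    shutter closed; any signal in that black frame comes from hot
    pixels and thermal noise, so it can be located there and
    subtracted from the real image.
    """
    hot = dark_frame > hot_threshold               # hot-pixel locations
    cleaned = image.astype(np.int32) - dark_frame  # remove the thermal signal
    return np.clip(cleaned, 0, 255).astype(np.uint8), np.argwhere(hot)

image = np.full((3, 3), 120, dtype=np.uint8)
dark = np.zeros((3, 3), dtype=np.uint8)
image[0, 0] = 200                                  # a hot pixel...
dark[0, 0] = 80                                    # ...also visible in the black frame
cleaned, hot = dark_frame_subtract(image, dark)
print(cleaned[0, 0], hot)  # → 120 [[0 0]]
```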

So, they've probably got more pixels mapped out that would show up at "typical" exposure times, too. But, because of the higher resolution, a few more of them may not be any more noticeable, provided the overall percentage is not increasing.

JimC is offline   Reply With Quote
 