
Old Aug 27, 2003, 9:39 AM   #11
Senior Member
 
Join Date: Mar 2003
Posts: 121

I just thought of something. Shouldn't sensitivity be measured as something like photons per square micrometer, or something similar?

If that were true, then the small, high-res CCD would win over the large, low-res CCD.

One thing I have been wondering about recently is medium format. In medium format the film is larger, so the DOF is shallower and the grain is smaller on the final print. But what about ISO? Does ISO 400 in 35mm equal ISO 400 in medium format?

In digital photography we see that this is not the case. The Canon 1Ds at ISO 400 is probably equivalent to the Canon A70 at ISO 50, even though both end up with the same exposure. One gets there by collecting light over a large area, the other by amplifying the signal the light creates in a small area.
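To hang some rough numbers on the area idea, here is a back-of-the-envelope sketch. The pixel pitches and photon density below are just illustrative guesses, not real specs; the point is only that photons collected scale with pixel area, and shot-noise SNR scales with the square root of the photon count.

[code]
# Rough sketch: how pixel area affects photon count and shot-noise SNR.
# Pixel pitches and photon density are illustrative assumptions, not specs.
import math

photons_per_um2 = 100.0   # assumed photons per square micron for a fixed exposure

pixels = {
    "large DSLR-style pixel (~9 um pitch, assumed)": 9.0,
    "small compact-style pixel (~3 um pitch, assumed)": 3.0,
}

for name, pitch_um in pixels.items():
    area = pitch_um ** 2                  # pixel area in square microns
    photons = photons_per_um2 * area      # photons collected per pixel
    snr = math.sqrt(photons)              # shot-noise-limited SNR = N / sqrt(N) = sqrt(N)
    print(f"{name}: {photons:.0f} photons, shot-noise SNR ~ {snr:.0f}")
[/code]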

More Questions! Ahhh! :roll:
Melboz99
Old Aug 27, 2003, 9:42 AM   #12
Senior Member
 
Join Date: Jul 2003
Posts: 115

Just want to say this is a very interesting thread to follow! As to the subject of the thread, I have nothing to add yet.
Jayson
Old Aug 27, 2003, 1:38 PM   #13
Senior Member
 
Join Date: Aug 2002
Posts: 2,162

Forget ISO and sensitivity in f stop terms. These are only translations of what you are used to in film photography.

In any system where something small is amplified, additional unwanted electrical 'disturbances' we call noise will be added. This derives from the physics of semiconductors and the temperature of the devices. It has a lot to do with equations and Boltzmann's constant - but the physicists will explain that. Theoretically, sensors or circuits at absolute zero (supercooled) produce no thermal noise. Check out the liquid-nitrogen-cooled electronics in some of the big astronomy radio telescopes.

Now in simple terms, when you capture something in a sensor and convert it to an electrical signal, it must be processed and amplified. The processing can include light response correction (or gamma correction). Amplification and/or digital conversion, plus all the other stuff like white balance and sharpening, will be done in stages in a particular order, and there is a well-known equation which calculates the sum of all the individual added noise. Put simply, that equation shows that in an amplifier chain, any noise introduced at the lowest-signal end, i.e. the sensor, will have the most effect on the noise at the output.
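For the curious, the cascaded-stage calculation is presumably something like the Friis noise formula used for chains of amplifier stages. Here is a rough Python sketch with made-up gains and noise factors, showing why the first, lowest-signal stage dominates:

[code]
# Sketch of the cascaded-stage (Friis) noise formula:
#   F_total = F1 + (F2-1)/G1 + (F3-1)/(G1*G2) + ...
# Gains and noise factors are linear ratios; values are made up for illustration.

def total_noise_factor(stages):
    """stages: list of (gain, noise_factor) tuples, in signal order."""
    total = 0.0
    gain_so_far = 1.0
    for i, (gain, nf) in enumerate(stages):
        if i == 0:
            total += nf                        # first stage counts in full
        else:
            total += (nf - 1.0) / gain_so_far  # later stages are divided down
        gain_so_far *= gain
    return total

quiet_front_end = [(10.0, 1.5), (10.0, 4.0), (10.0, 4.0)]
noisy_front_end = [(10.0, 4.0), (10.0, 1.5), (10.0, 1.5)]

print(total_noise_factor(quiet_front_end))   # ~1.83 - set by the quiet first stage
print(total_noise_factor(noisy_front_end))   # ~4.06 - first-stage noise can't be undone
[/code]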

In camera parlance, sensitivity can mean electronic amplification for a good useable signal or image. What I am saying is that if the sensor and first amplifier or digital register are producing thermally related jitter, in analogue or digital form, then amplifying the signal (or changing its digital bit weighting) will amplify the unwanted added error we call noise along with it. So you have more signal, but the same proportion of noise. I am leaving a lot out, because modifying digital weighting in the early stages of signal acquisition can dramatically worsen noise, so designers think carefully about how much amplification is needed at which stages, and where to put it for the lowest noise.
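A toy illustration of that last point, with arbitrary numbers: multiplying the signal by a gain - which is roughly what a higher ISO setting does - multiplies the noise already present by the same factor, so the output SNR does not improve.

[code]
# Toy illustration: gain applied after the sensor scales signal and
# existing noise equally, so SNR is unchanged.  Numbers are arbitrary.
signal = 200.0      # arbitrary units from the sensor
noise = 10.0        # noise already added at/before the first stage

for gain in (1, 2, 4, 8):   # e.g. ISO 100, 200, 400, 800 equivalents
    print(f"gain x{gain}: signal={signal*gain:.0f}, "
          f"noise={noise*gain:.0f}, SNR={signal*gain/(noise*gain):.1f}")
[/code]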

Therefore, to achieve 'usable' amplification or sensitivity, you must reduce noise at the 1st stage which is the sensor. You can supercool it, design the physics of the pixel elements to reduce noise, or increase the effective surface area.

Sony produce their HyperHAD video cam chips, where they add a micro-lens to each pixel. The surface area is most important. I said 'effective' because if you had 1 Mpix of small-area pixels spread out over a large area, there's no improvement. The improvement comes from maximising the pixel element area density. So if you are looking at a full-frame DSLR, this could come from fewer (lower-res) larger pixels filling the area, or many more (higher-res) pixels with the least wasted space between adjacent pixels. Imagine a small 5 Mpix sensor, duplicated over a full 35mm frame area - that would be one super (expensive) hi-res, sensitive chip. But now we have options. Because it's area that counts, we can sacrifice resolution, combine pixels electrically (increasing the unit area) and increase sensitivity. So there are some things to think about when looking at full-frame DSLRs. Have they really increased the effective area and gained usable ISO sensitivity, or just populated the frame with pixels of a similar count and size as in a consumer cam?

Pixel combining is what most cams do that offer 800-1600 ASA equivalent sensitivity. The problem is that noise is the limiting factor because the effective pixel area is still small. Manufacturers will get the pixel density as high as possible on small chips to keep fabrication and lens cost down in compact designs. But there comes a physical limit, at which point adopting improved sensor physics, increasing the area, or the HyperHAD lens trick is all that's left.
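As a rough sketch of why combining pixels helps (assuming the noise in neighbouring pixels is uncorrelated and adds in quadrature - a simplification):

[code]
# Rough sketch of pixel combining (binning): summing n pixels adds the
# signal directly but uncorrelated noise only in quadrature, so SNR
# improves by sqrt(n) at the cost of resolution.  Values are arbitrary.
import math

signal_per_pixel = 50.0
noise_per_pixel = 10.0

for n in (1, 2, 4):                      # e.g. no binning, 2x1, 2x2 combining
    signal = n * signal_per_pixel
    noise = math.sqrt(n) * noise_per_pixel
    print(f"{n} pixel(s) combined: SNR = {signal / noise:.1f}")
[/code]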

This is why, with present small sensor sizes, usable sensitivity can come from really big lenses, e.g. f/1.8. The lens becomes the first-stage (light) amplifier, so the noise contributed by the sensor is a stage further down the chain and matters less. In a crude film analogy, 400 ASA film and an f/2.8 lens will give a worse result in grain or noise terms than 200 ASA film with an f/1.8 lens, where the 'sensitivity' of both combinations is nominally the same. There are many clever processing tricks I have not mentioned; mostly they fool the eye by, for example, reducing noise in the brightness part of the picture to hide the problem in the coloured areas. Green is the most sensitive and the dominant colour signal in white light, so look at noise performance in saturated blues and reds when comparing cameras and their ISOs. VOX
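To put the crude film analogy above into rough numbers (pure f-number geometry, ignoring lens transmission losses):

[code]
# Rough comparison of light gathered: faster lens vs faster film/ISO.
# Ignores lens transmission losses; purely geometric f-number arithmetic.
import math

light_ratio = (2.8 / 1.8) ** 2                 # f/1.8 vs f/2.8 aperture area
stops_from_lens = math.log2(light_ratio)       # ~1.3 stops more light
stops_from_film = math.log2(400 / 200)         # 1 stop from the faster film

print(f"f/1.8 vs f/2.8: {light_ratio:.2f}x light ({stops_from_lens:.2f} stops)")
print(f"400 vs 200 ASA: {stops_from_film:.1f} stop of extra 'sensitivity'")
[/code]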
voxmagna
Old Aug 27, 2003, 3:08 PM   #14
Junior Member
 
Join Date: Jul 2003
Posts: 25
Wow!

That is too much information :roll: ... I guess the guy who posted the question in the first place would have got confused and scared enough to ask the teacher to change his topic. :P

Anyway... that was very enlightening! :shock:
bhaskarannadata
Old Aug 27, 2003, 6:09 PM   #15
Senior Member
 
Join Date: Aug 2002
Posts: 2,162

Ask a searching question and you get a long answer. Or dumb it down to a one-liner and never get the real answer - just read the stuff on the box or in the ad, buy the most Mpix, and stay ignorant and happy!
voxmagna
Old Aug 27, 2003, 10:01 PM   #16
Senior Member
 
Join Date: Dec 2002
Posts: 430

Isn't the issue of sensitivity (film or CCD) a time-related thing, rather than an area thing?

Take a piece of photographic film (at a given ISO/ASA speed) and open the lens in a dark room; eventually the film will integrate enough photons to make an image. The trick is to get enough photons reflected from the subject to land on the film before stray photons (from background radiation and other reflections) hit the film and fog it such that the subject image is obscured, i.e., record more signal (subject image) than noise (background clutter). It doesn't matter how big the piece of film is when addressing signal-to-noise ratio (sensitivity).

Similarly, with digital photo sensors such as CCDs, the photo-sensitive area of the imager chip converts photon flux into an electrical charge which accumulates in the photosite for as long as the lens is open, or until the charge is transferred to the read-out structure. The issue here is that there is also a non-image charge added into the mix from thermally agitated electrons (noise), which also accumulates during this time. Thus the sensitivity of a digital sensor is a function of its ability to convert and integrate photon-induced charge from the subject image faster than the thermally generated noise charge.
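A crude numeric sketch of that race between image charge and thermally generated charge (all the rates and the read noise below are invented example values, not real device figures):

[code]
# Crude sketch: photon-induced charge vs thermally generated (dark) charge
# accumulating over the exposure.  All rates are invented example values.
import math

photon_rate = 500.0     # signal electrons per second at the photosite
dark_rate = 20.0        # thermal (dark-current) electrons per second
read_noise = 5.0        # electrons of noise added when the charge is read out

for t in (0.01, 0.1, 1.0, 10.0):            # exposure time in seconds
    signal = photon_rate * t
    # shot noise on signal + dark charge, plus read noise, added in quadrature
    noise = math.sqrt(signal + dark_rate * t + read_noise ** 2)
    print(f"{t:5.2f} s exposure: SNR = {signal / noise:.1f}")
[/code]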

So there are (at least) two variables to control which affect sensitivity: the photon efficiency of the photo-sensitive element (Sony's Super HAD is one approach) and the thermal noise (the Oly E series, or professional devices with actively cooled sensors, address this).

In theory, it is possible to construct a digital imaging sensor with greater sensitivity (higher signal-to-noise ratio) than can be achieved with silver-halide film.

In terms of consumer economic reality, however, it can be argued that commercially available film currently has higher sensitivity (and dynamic range) than consumer-level digital imagers.
jawz
Old Aug 28, 2003, 6:38 AM   #17
Senior Member
 
Join Date: Aug 2002
Posts: 2,162

Jawz... I'm in general agreement with your perspective on what manufacturers will provide at the market price.

You mentioned at least two variables contributing to sensitivity, but are you excluding effective area, since this is something that may be more easily scaled with current sensors and better manufacturing yields? Or are you saying there is more mileage in developing the physics of sensor design and keeping the area small? VOX
voxmagna
Old Aug 28, 2003, 11:52 AM   #18
Senior Member
 
Join Date: Mar 2003
Posts: 121

Quote:
Forget ISO and sensitivity in f stop terms. These are only translations of what you are used to in film photography.
Okay, so we are then going to measure sensitivity before any electronic or other modification. Therefore, sensitivity can be measured by the area that is used to collect light.

But perhaps here is a really big, perhaps unanswerable question: what is the base ISO equivalent of one square micron of CCD? We all know that digital cameras can increase or possibly decrease(?) the signal that a sensor creates, but unaltered, what ISO equivalent is one square micron of CCD when compared to one square micron of film? Is that the question better put?


Quote:
In any system where something small is amplified, additional unwanted electrical 'disturbances' we call noise will be added. This derives from the physics of semiconductors and the temperature of the devices. It has a lot to do with equations and Boltzmann's constant - but the physicists will explain that. Theoretically, sensors or circuits at absolute zero (supercooled) produce no thermal noise. Check out the liquid-nitrogen-cooled electronics in some of the big astronomy radio telescopes.

Now in simple terms, when you capture something in a sensor and convert it to an electrical signal, it must be processed and amplified. The processing can include light response correction (or gamma correction). Amplification and/or digital conversion, plus all the other stuff like white balance and sharpening, will be done in stages in a particular order, and there is a well-known equation which calculates the sum of all the individual added noise. Put simply, that equation shows that in an amplifier chain, any noise introduced at the lowest-signal end, i.e. the sensor, will have the most effect on the noise at the output.
Yes, I agree with all that.


Quote:
In camera parlance, sensitivity can mean electronic amplification for a good useable signal or image.
Ok, I think I am beginning to get a better understanding of sensitivity. It's not really about how much signal is going into the camera, but rather how much usable signal is coming out of the camera.


Quote:
Therefore, to achieve 'usable' amplification or sensitivity, you must reduce noise at the 1st stage which is the sensor. You can supercool it, design the physics of the pixel elements to reduce noise, or increase the effective surface area.
This is rather interesting. From what I am getting out of this statement, a camera with cooling systems or other means of reducing noise in the early stages could be called more sensitive than another camera with the same lens and CCD.

Quote:
Sony produce their HyperHAD video cam chips, where they add a micro-lens to each pixel. The surface area is most important. I said 'effective' because if you had 1 Mpix of small-area pixels spread out over a large area, there's no improvement. The improvement comes from maximising the pixel element area density. So if you are looking at a full-frame DSLR, this could come from fewer (lower-res) larger pixels filling the area, or many more (higher-res) pixels with the least wasted space between adjacent pixels. Imagine a small 5 Mpix sensor, duplicated over a full 35mm frame area - that would be one super (expensive) hi-res, sensitive chip. But now we have options. Because it's area that counts, we can sacrifice resolution, combine pixels electrically (increasing the unit area) and increase sensitivity. So there are some things to think about when looking at full-frame DSLRs. Have they really increased the effective area and gained usable ISO sensitivity, or just populated the frame with pixels of a similar count and size as in a consumer cam?
Okay, now you've thrown me for a loop. I used to believe that when a camera manufacturer such as Canon produced a 3.15 MP CCD on a 1/2.7" chip, the photosites had to be smaller than on a 2.0 MP CCD on a 1/2.7" chip. What you are saying is that Canon may actually produce the same-sized photosites on each camera, yet on the 2 MP one they are just spaced farther apart? This is shocking!

Why can't camera manufacturers increase the size of the individual photosites according to the resolution and size of the CCD? Are they just cutting costs?

Quote:
Green is the most sensitive and the dominant colour signal in white light, so look at noise performance in saturated blues and reds when comparing cameras and their ISOs.
Yeah, I had noticed that while my blue and red channels are always noise-ridden, the green channel is virtually noise-free. Now I understand why.

This is becoming a very interesting topic. Where did you learn all this stuff? Are you an electronics engineer, or do you just do a lot of reading on the net?

Dan O.
Melboz99
Old Aug 28, 2003, 5:19 PM   #19
Senior Member
 
Join Date: Nov 2002
Posts: 332

If a digital camera falls in the forest and there is no film camera there to take a photo, is there really any noise?
fporch
Old Aug 28, 2003, 7:55 PM   #20
Senior Member
 
Join Date: Aug 2002
Posts: 2,162

Melboz:

'....Yeah, I had noticed that while my blue and red channels are always noise-ridden, the green channel is virtually noise-free. Now I understand why....'

There are people who object to long posts - ignorance makes up for post-purchase dissonance, I suppose.

On the green channel issue: this arises because in the camera's RGB matrixing, the luminance of white light is made up of approximately 0.3 Red + 0.6 Green + 0.1 Blue. Gamma correction changes this, but green is the largest signal component and therefore has the best signal-to-noise ratio. If you look at the familiar bell-shaped CIE (luminosity) curve for white light, with red wavelengths at one end and blue at the other, you will see green and yellow sitting near the peak. There is more energy here, and more energy means more signal sensitivity. If you take the green channel in Photoshop and just sharpen that, you will be close to the theoretical luminance channel, since green is about 0.6 of the brightness or grey component in a pic.
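For anyone who wants to check the arithmetic, here is a small sketch using the standard Rec. 601 luma weights (the RGB sample values are arbitrary):

[code]
# Sketch: luma (brightness) is mostly green.  Rec. 601 weights are used;
# the RGB sample values are arbitrary.
def luma(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

r, g, b = 180, 200, 120               # arbitrary example pixel
print(f"luma = {luma(r, g, b):.1f}")  # prints a value close to the green channel alone
print(f"green alone = {g}")
# Green carries ~59% of the luma, so it has the strongest signal and best SNR;
# red and blue carry much less, which is why their channels look noisier.
[/code]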

From what I have said, you can see that amplifying a colour signal will always give poorer noise in the red and blue channels - that's the place to compare ISO performance between different cameras! But most sensors use more blue and red pixels in the matrix to improve this. So ask yourself what you might be getting in an 'X Mpix' sensor. There are a number of permutations which will offer different sensitivities and noise performance.

Colorimetry is also a very complicated subject. Much of the work is based on how we actually see colour, how it can be consistently measured, and how a particular sensor technology or emissive display departs from an ideal response, so that faithful colour reproduction has to be compromised. We just cannot make anything yet that works as well as our eyes.

....What you are saying is that Canon may actually produce the same-sized photosites on each camera, yet on the 2 MP one they are just spaced farther apart? This is shocking!....

Why do you think hi-res CRT tubes are more expensive for the same glass size? The dot pitch is reduced to fit in more dots/pixels, which also causes problems with the physical construction. There are also limits to how densely you can mask and fab devices onto silicon. When you look on the box to see what your graphics adaptor will do, think about whether your display would ever realise it at usable brightness.

I apologise to Katina for the lengthy posts, but as you can see, a simple one-line answer would be a dumb one! But perhaps somebody else can offer a dummies' reply where I have failed. VOX
voxmagna
 