Steve's Digicams Forums


Bynx Mar 30, 2010 7:56 AM

What is the Percentage Range?
 
If you consider the sun as being 100% white, and a totally black area as 0%, what is the % range that a camera can capture as (1) a jpeg, and (2) a Raw file? For argument's sake, let's say the camera is a 10 megapixel. Are there any factors which would change the percentages, like the sensor size or the size of the lens? It seems that when bracketing shots to create an HDR image, the percentage range is increased beyond that of the optimal single jpeg or Raw file.

Mark1616 Mar 30, 2010 8:40 AM

Not a clue but interested to find out the answers..... although I'm guessing the Sun is not pure white, but we can use the assumption it is for this process (I could be wrong).

Bynx Mar 30, 2010 8:51 AM

I guess what we are talking here is brilliance rather than color. Or are we talking temperature? Anyway the sun is the brightest and the shadiest is the darkest.

pbjunkiee Mar 30, 2010 10:29 AM

Well, if you think about it, when you look at Levels in Photoshop, the left side is black and the right side is white, so you can get a true black and a true white in an image. I'm guessing it can pick up the entire range?

TCav Mar 30, 2010 12:15 PM

The Sun isn't 100% white. That's why the sky is blue and sunsets are red. Magnesium Oxide is 100% white (well, 99.996% white, actually), even if it's in a dark room.

Digital cameras are capable of recording images that range in brightness from 2 to 20 EV, but not all at the same time. They have a dynamic range of 9 to 11 EV (from dimmest to brightest), but a good deal of that dynamic range is lost during processing and the conversion of 12 or 14 bit data to 10 or 8 bit data in JPEG images.
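
If it helps to see what those EV figures mean as plain contrast ratios: each EV step is a doubling of light, so a range of N EV is a ratio of 2^N to 1. A quick Python sketch (the EV values are just the ones quoted above, not measurements of any particular camera):

Code:

# Each EV (stop) is a doubling of light, so a dynamic range of N EV
# corresponds to a contrast ratio of 2**N : 1.
for ev in (2, 9, 11, 20):
    print(f"{ev:2d} EV  =  {2 ** ev:>9,d} : 1")
# 9 EV is about 512:1, 11 EV about 2048:1, and 20 EV is over a million to one.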

Bynx Mar 30, 2010 5:20 PM

Ya, I'm not talking about color when I mentioned the sun. I just figure it's the brightest thing to shoot. And absolute darkness would be the darkest thing. So if one is 100% and the other is 0%, then the range of a single jpeg should be, say, from 15% to 80%. A Raw file might go 10% to 90%. No single file can go from 0 to 100. Now TCav, you talk about EV. So that can translate to percentages, right? It doesn't seem the dynamic range of any shot is very much compared to how much there is in a single full-range scene with the sun shining and lots of deep shadows. The reason I ask this is that when I take shots for HDR use, if I underexpose a lot for the sun and overexpose a lot for the shadows, then the range has to be a lot more than any single file, jpeg or Raw. If this is true then HDR has to be an improvement in any shot with a long range.

Mark R. Mar 31, 2010 3:47 AM

Quote:

Originally Posted by TCav (Post 1072665)
The Sun isn't 100% white.

This is, in itself, true. The sun's light is a mixture of emission spectra of hydrogen, helium, carbon, etc. But it's almost white, i.e. it contains almost all visible wavelengths. If you look at a spectrum of the sun's light, there are only a few wavelengths missing. (These are absorbed by gases in the sun.)

Quote:

Originally Posted by TCav (Post 1072665)
That's why the sky is blue and sunsets are red.

With all due respect - this is nonsense. Even if the sun were a perfect white (containing all visible wavelengths in its spectrum), the sky would still be blue, and sunsets would still be red. These sky colorations have nothing to do with the "whiteness" or spectral make-up of the sun; rather they are atmospheric effects, having to do with the stronger scattering of short wavelengths. Blue light is scattered sideways by the air molecules and dust - hence the sky is blue. In the evening, the sunlight travels a long distance through the atmosphere, hence most of the blue and green light is scattered away. What remains is yellow, orange and red.

Quote:

Originally Posted by TCav (Post 1072665)
Magnesium Oxide is 100% white (well, 99.996% white, actually), even if it's in a dark room.

Magnesium oxide, a white powder, is not a light source. It is a light reflector. It can only reflect what it receives. If you put it in sunlight, it will reflect sunlight. To my knowledge, the best white reflector is Titanium dioxide.

However, what you may be referring to, is magnesium that is actually burning. Because the flame is so hot, the material emits black-body-radiation across pretty much the whole visible spectrum, making it a near-perfect white.

Regards,
Mark

TCav Mar 31, 2010 5:48 AM

The earth's atmosphere scatters short-wavelength light more than long-wavelength light. When the sun is high in the sky, violet and blue light is scattered over the entire sky, making it blue, and the wavelengths of light that are not scattered make the sun appear yellowish. When the sun is lower in the sky, its light travels through more of the atmosphere, so more of its light is scattered, making the sun, and the sky around it, orange and red.

Magnesium Oxide, along with Magnesium Carbonate and Barium Sulphate, is a frequently used standard for white because of its near-perfect reflectance factor. Titanium dioxide is commonly used as a pigment in paint and food, but it has an absorption band around 400nm, so it's "warm" and isn't used as a standard for 'White'.

Bynx Mar 31, 2010 6:28 AM

Well, two pretty smart guys duking it out over the sun and the sky. How about answering the question? I think it's an important one and should be easy enough to answer. It's a technical question and should apply to all cameras.

Mark1616 Mar 31, 2010 6:49 AM

Well I think the thing we need to know is what is the difference in EV between the sun and the inside of a closed box..... would that give the scale?

TCav Mar 31, 2010 6:51 AM

I would like to think that we all know enough not to point our cameras directly into the sun, so I think that even the sun is more brightness than we need to account for in our photography.

There is no 100% bright; there's only the maximum amount of brightness that can be recorded by a sensor. That's why the EV scale is open-ended. What you're talking about is the dynamic range of an image sensor, and that's what HDR is for: to get around perceived shortcomings of the combined effects of image sensors and output devices in this area.

Mark1616 Mar 31, 2010 6:54 AM

I think he is saying, make the assumption the sun is 100%, not stating categorically that it is. So it is a theoretical range using things we can all appreciate.

Apart from that I can't 'shed any light' on this topic.

Mark R. Mar 31, 2010 10:00 AM

Quote:

Originally Posted by Bynx (Post 1072980)
Well, two pretty smart guys duking it out over the sun and the sky. How about answering the question? I think it's an important one and should be easy enough to answer. It's a technical question and should apply to all cameras.

OK, while I could go on about atmospheric refraction and how it has nothing to do with the whiteness of the light source, let me try to contribute to an answer for the original question.

I would suggest, for starters, to consult definitions of illuminance and EV.

Illuminance is measured in lux, defined as 1 lumen per square meter. It is therefore a measure of light flux per unit area.
See also:
http://en.wikipedia.org/wiki/Illuminance
From there, I quote:
Quote:

The human eye is capable of seeing somewhat more than a 2 trillion-fold range: The presence of white objects is somewhat discernible under starlight, at 5×10^−5 lux, while at the bright end, it is possible to read large text at 10^8 lux, or about 1,000 times that of direct sunlight, although this can be very uncomfortable and cause long-lasting afterimages.
There is also a table of typical lux values at
http://en.wikipedia.org/wiki/Lux

On the other hand, EV is a scale based on factors of 2, so for every EV step, the amount of incident light doubles.

Comparing 10^−5 lux (starlight) to 10^5 lux (direct sunlight), this is a factor of 10^10. On a scale of powers of 2, this corresponds to about 30 EV steps.
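
If anyone wants to check that arithmetic, a couple of lines of Python (using the rounded 10^-5 and 10^5 lux figures above) give a little over 33 stops, so my "about 30" is on the low side but in the right ballpark:

Code:

import math

starlight_lux = 1e-5    # rough illuminance under starlight (from the lux table)
sunlight_lux = 1e5      # rough illuminance in direct sunlight

ratio = sunlight_lux / starlight_lux        # a factor of 10**10
ev_steps = math.log2(ratio)                 # EV steps = log base 2 of the ratio
print(f"ratio = {ratio:.0e}, EV steps = {ev_steps:.1f}")   # about 33.2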

Remember that the scale is open on both ends, so in terms of illuminance, a truly "black box" would read "ten to the power of negative infinity". So that's not really practical. You have to choose a lower and upper threshold, which is what I've tried to do with the examples from the lux table.

Caveat: illuminance is not the same as irradiance, which measures the physical power of the radiation. I'm not sure whether one should rather use irradiance to do the above calculation.

Also, none of us would use the same camera settings to shoot direct sunlight and a closed box. So I'm actually not quite sure I understand the "actual question", Bynx.

If there are photons, no matter how few, and you leave your aperture open for long enough, you'll get a picture. Conversely, if you make your aperture small enough and your shutter fast enough, direct sunlight won't over-saturate your sensor.

So what exactly is the question that we're supposed to answer?

Regards,
Mark

mtngal Mar 31, 2010 3:29 PM

I can contribute nothing to the technical nature of your question, only a practical one from playing around with several different cameras. So, in the "for what it's worth" category, here are some thoughts about what I understand to be Bynx's question/concern/point, based on his second post, mentioning HDR vs. raw vs. jpg.

Photography is more about reflected light (how light reflects off of objects) than it is about direct light (light coming from a source such as a lightbulb or the sun). So a lot of the discussion about light, which I personally find quite interesting from an intellectual point of view, seems to not quite answer Bynx's question, which is essentially: can a raw file practically capture enough information for all the dynamic range in a possible picture? Since most people don't shoot light emitters as a subject/intentionally (well, there are a number of exceptions, such as pictures of cool neon signs), just light reflectors, I think his answer lies more with practical photography. While reflected light might span a smaller dynamic range than that from complete black to the brightest light possible (i.e., sunlight?), it's still a huge range.

I can't come up with figures or research - that's not my "thing". However, I have shot both jpg and raw and compared them. I've shot pictures in situations with a huge amount of dynamic range (I do that a lot as a matter of fact). I've played around a bit with HDR also.

My practical experience is this - raw will give you an edge over jpg. But it isn't all that much more - just as a guess, it gives maybe a half to one stop more on the bright side (at the most, and my current camera might not even give me that much) and maybe 1-2 stops more in the shadows (maybe a bit more). As soon as you start "pushing" the shadows, you introduce noise and the more you push, the more noise you get, and the harder it is to see details. And if the highlights are blown out, they are gone no matter what - raw will not give you something that's not there.

So your premise that in situations where there is a huge range of light/dark, HDR will do better than a single raw file is (in my experience) definitely true. I have a series of photos that I did as HDR that came out very nicely. When I decided that they would look even better in b&w, I tried to use the middle exposure only, since I didn't care if the sky was blown out/white. The only problem was that the middle exposure had some of the building blown out, and even using the raw file didn't bring back detail in that spot. So I took the first underexposed file that had detail in those spots, and tried to lighten the shadows enough to bring out the detail in the other parts. The noise was so significant that running it through noise reduction software couldn't retain the fine details, and it was very distracting even when converted to b&w. At that point I got frustrated and just did a b&w conversion of the HDR file, which contained information in both the brightest and darkest parts of the picture and had no noise.

My conclusion - jpg works well for many things. Raw works better than jpg but is still limited. HDR is the only way to go for certain scenes, and can be the closest to what my brain interprets a scene to be, and is the only way to capture detail everywhere.

But that goes only for scenes where there's a very big dynamic range that you want to capture detail in. The vast majority of most people's subjects don't require it. And sometimes the smaller dynamic range capability of the camera sensor works in my favor, rendering an OOF background as black where it might have been ugly dark green blotches.

Going on to another part of your original question - I'm not so sure that overall sensor size affects how much dynamic range a camera is capable of capturing. I think the pixel rating has more of an influence - my old Sony F717 with its small 5 MP sensor was quite capable of capturing a pretty wide dynamic range, more so than the 8 MP Panny FZ30 I briefly owned (the sensor sizes were, I think, similar). I've often wondered if a sensor's dynamic range capability is connected to its noise sensitivity - as sensors become noisier they can't capture detail in as big a range (could that be manufacturers trying to deal with the increased noise by cutting down the dynamic range so you don't see the noise in the shadows? I don't know).

And how much does a lens contribute to a picture's dynamic range? It can, but doesn't always, have an effect. I have two lenses, both high quality. One is a zoom and one a prime. The zoom is a much bigger lens with a bigger front element (not surprising due to the complexity of making a constant-aperture f2.8 zoom lens). The prime has a small front element and is (relatively) tiny, even though it is faster (f1.8). The prime has a greater dynamic range and renders blacks much better than the zoom, with noticeably better contrast. Another prime lens I have is an elderly one with no coatings. It's definitely low-contrast and is hopeless when shooting sunsets and bright colors. But it doesn't over-saturate flowers, handles green very well, and it's really sharp, so it can give details where other lenses make things look smoother due to over-saturation of bright colors. So yes, my experience is that lenses do make a difference - but not a huge amount, certainly not as much as the sensor does.

TCav Mar 31, 2010 4:18 PM

When you look at something, your iris adjusts the diameter of the pupil so that you receive enough light to see and recognize the detail on whatever it is you're looking at. Other nearby objects that may be brighter or darker are not as visible because you aren't trying to look at them. When you shift your gaze to those other objects, your iris readjusts so you can, but then what you looked at originally is less visible.

When you take a photo, the camera's autoexposure system adjusts the exposure settings to capture an image with visible detail on whatever you pointed the camera at. When you look at other areas of the image that were not properly exposed, there is no way to go back and properly expose that part of the image, like there is with your eye.

HDR is a way to create an image that is more properly exposed from edge to edge and corner to corner, which is not possible any other way, and indeed isn't possible in real life. Image sensors have a dynamic range of about 9 to 11 EV, which is plenty, but when 14 or 12 bit RAW data is converted to 10 or 8 bit images, and then displayed on 10 or 8 bit sRGB monitors, that dynamic range is lost. HDR is an attempt to compress the dynamic range of a scene so that it can survive all that. The dynamic range of any single component is irrelevant since there are so many bottlenecks in the process.
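
For what it's worth, here is a very rough sketch in Python/NumPy of the general idea - merge bracketed exposures back onto one linear radiance scale, then compress that range so it fits in 8 bits. It is not what any particular camera or HDR package actually does; the weighting, clipping thresholds and exposure spacing are made up purely for illustration.

Code:

import numpy as np

def merge_and_tonemap(frames, ev_offsets):
    """Toy HDR: combine bracketed linear frames into one radiance estimate,
    then log-compress it into 0..255. Illustration only."""
    radiance = np.zeros_like(frames[0], dtype=np.float64)
    weight = np.zeros_like(frames[0], dtype=np.float64)
    for frame, ev in zip(frames, ev_offsets):
        # Ignore pixels that are nearly black or blown out in this frame,
        # and scale the rest back onto the base exposure's radiance scale.
        valid = (frame > 0.01) & (frame < 0.99)
        radiance += np.where(valid, frame * 2.0 ** -ev, 0.0)
        weight += valid
    radiance /= np.maximum(weight, 1)
    # Log tone map: squeeze the wide radiance range into 8 bits.
    tonemapped = np.log1p(radiance) / np.log1p(radiance.max())
    return (tonemapped * 255).astype(np.uint8)

# Toy scene: one deep-shadow pixel and one bright pixel, bracketed 2 EV apart.
scene = np.array([0.02, 0.6])
frames = [np.clip(scene * 2.0 ** ev, 0.0, 1.0) for ev in (-2, 0, +2)]
print(merge_and_tonemap(frames, ev_offsets=[-2, 0, +2]))   # merged, tone-mapped 8-bit values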

Bynx Apr 1, 2010 8:16 AM

The purpose of my question was to have the relevance of HDR brought out. I was getting a little pissed at those couple of guys who kept saying they could do it better this way or that way and HDR wasn't necessary - that Raw was superior and better than going the HDR route. HDR is the only way to have a range which closely matches the range of the human eye. TCav nicely points out the eye looking at one spot and adjusting, then moving to another darker or lighter spot and readjusting. Well, in an HDR photo all the adjusting has been done, so it's just a matter of looking around, whether it's light or dark. While I didn't get a precise answer to my question, it's been answered in a practical way. Thanks to the three of you.

ac.smith Apr 1, 2010 2:21 PM

Quote:

Originally Posted by Bynx (Post 1072980)
I think it's an important one and should be easy enough to answer. It's a technical question and should apply to all cameras.

It is not anywhere near as simple as you make it out to be. The first problem is that you are asking for an answer on a linear scale (percentages) while we typically measure light photographically on a logarithmic scale and astronomically on an inverse logarithmic scale.

The apparent (visual) magnitude, an inverse logarithmic scale, of the sun is -26.73, and the darkest sky on earth (which may not be true black) is about 9 (http://en.wikipedia.org/wiki/Apparent_magnitude). A very coarse photographic approximation might be on the order of 36 EV.

Based loosely on DPReview's camera dynamic range tests, a number of jpeg engines on DSLRs manage about an 8.5 EV dynamic range and at least some manage 10 EV in raw. That would seem to suggest that three RAW exposures spaced at 9-stop intervals might capture the dynamic range the OP referenced when processed as HDR. Fortunately, as stated above, most scenes do not have anywhere near that dynamic range.

A. C.

Bynx Apr 1, 2010 3:45 PM

Well ac, I sure didn't mean all that easy. But there are constants we are dealing with so there is an answer. A jpeg will cover 9 EV of the possible 36 EV which will actually be closer to 30. A Raw file would cover 12 EV?

iowa_jim Apr 1, 2010 6:28 PM

Don't look a gift horse in the mouth.

TCav Apr 1, 2010 8:26 PM

The Wikipedia article on HDR lists the dynamic range of a Canon EOS-1D Mark II. At ISO 50, it has a dynamic range of 11.3 EV, it peaks at ISO 100 with a dynamic range of 11.6 EV, and at ISO 3200, it has a dynamic range of only 8.7 EV.

http://www.dxomark.com/ provides similar values for other cameras.

VTphotog Apr 1, 2010 9:15 PM

To me, the practical advantage of HDR is that by having at least one of the frames exposed for the shadows, the noise in the darker areas will be reduced. To do this with a single RAW image, as some do, means having to amplify the dark areas more, amplifying noise as well. If you then do noise reduction, you also eliminate some of the detail.
Tone mapping of RAW files is equivalent to using a higher ISO setting for the dark areas, with concomitant noise.

brian

Alan T Apr 2, 2010 6:41 AM

Quote:

Originally Posted by Bynx (Post 1073629)
Well ac, I sure didn't mean all that easy. But there are constants we are dealing with so there is an answer. A jpeg will cover 9 EV of the possible 36 EV which will actually be closer to 30. A Raw file would cover 12 EV?

You have hit the nail on the head here Bynx, my old friend. I may be wrong but I think this question is much simpler than everyone is making it, because we're dealing with digital data recording.

Let us consider for the sake of argument a B&W jpg image. The blackest black it can record has RGB values of (0,0,0), and the whitest white has (255,255,255). What light intensity relates to each of those 256 levels of grey (8-bits) depends on the exposure, and how it's recorded by the particular sensor we happen to be using. The bigger the dynamic range of which the sensor is capable, the more shades of grey could in principle be recorded. But we are going to put them straight into a jpg file with only 256 levels, or a RAW file with however many it can store.

We can record more levels by adjusting the exposure up and down, and doing HDR on them. How many we record altogether depends on how many exposures we include between the longest and the shortest, and how much patience we have.
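
To make that concrete, here is a tiny Python sketch of the idea: the same scene luminance quantised to 8 bits at three different exposures lands on different 0-255 levels, but each single file still only has 256 of them. (It ignores the gamma/tone curve a real jpeg engine would apply, and the luminance values are made up.)

Code:

def quantise_8bit(luminance, exposure_ev):
    # Purely linear toy model: scale by the exposure offset, clip, and
    # round to one of 256 levels. Real jpegs apply a tone curve first.
    scaled = luminance * 2.0 ** exposure_ev
    return max(0, min(255, round(scaled * 255)))

scene = [0.002, 0.05, 0.4, 3.0]   # deep shadow .. brighter than "paper white"
for ev in (-2, 0, +2):
    print(ev, [quantise_8bit(lum, ev) for lum in scene])
# At -2 EV the brightest value fits but the shadows crush to 0;
# at +2 EV the shadows get usable levels but the highlights clip at 255.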

At the short exposure end, the most light we'll encounter entering our notional lens is, as discussed, a direct shot of the sun's surface from outer space, and there'll be more the closer we get. In practical terms on Earth, I suppose it'll be on top of a high mountain. However, our sun isn't a particularly hot star.

At the long exposure end the dimmest we'll get is no photons at all, and to get there we need a subject at absolute zero. The ratio between that and the intensity of any light at all is infinite, and Mark R has already given us the theoretical answer to Bynx's question - infinity percent.

One practical limit would be how long you're willing to wait for your long exposure to end. It'd be best to finish before the sun swallows the Earth, as it will in due course.

We really need a competent astrophysicist plus a current optical astronomer to answer this question realistically. Someone in an observatory on top of a high mountain should be able to tell us what is the intensity ratio between the brightest and dimmest objects they can practically observe. The low limit will be set by the thermal noise in the liquid helium-cooled detectors.

Prof. Brian Cox would do nicely for half the team if anyone wants to drop him a line. He works on the Large Hadron Collider at CERN, Geneva, and has recently been presenting a TV series here on the Solar System, in which he occasionally waxes lyrical, in his other capacity as a retired rock musician.
(http://en.wikipedia.org/wiki/Brian_Cox_(physicist))

TCav Apr 2, 2010 7:30 AM

If we stick with B&W (for the sake of simplicity), then an 8 bit image is limited to 256 discrete grays, within the dynamic range of the image. HDR doesn't change that. It just brightens the darker areas of the image, and darkens the brighter areas, so detail in those areas can be seen instead of being lost in the shadows and highlights. The image is still limited to 256 discrete grays; we're just arbitrarily picking and choosing the exposure values we want applied to different areas of the image. We don't record more levels by adjusting the exposure settings; we just choose to position those levels at different places within the dynamic range of the image.

A 12 bit (4096 discrete grays) or 14 bit (16384 discrete grays) image simply splits the dynamic range of the image sensor into finer parts. It doesn't increase it.
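
The level counts are just powers of two, which a couple of lines make obvious (the 11 EV sensor range here is only the rough figure quoted earlier in the thread):

Code:

sensor_range_ev = 11    # rough sensor dynamic range quoted earlier in the thread
for bits in (8, 12, 14):
    levels = 2 ** bits
    print(f"{bits:2d} bits -> {levels:>6,d} levels slicing the same {sensor_range_ev} EV range")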

ac.smith already gave us a significant list of the brightest and dimmest objects in the sky, in http://en.wikipedia.org/wiki/Apparent_magnitude. But these are all sources of light (either directly or indirectly.) There is no value given for the "apparent magnitude" of interstellar space, so even this doesn't answer the question. The dimmest object in the night sky is still transmitting or reflecting light, so while it is quite dim, it isn't actually "dark".

ac.smith Apr 2, 2010 7:58 AM

Quote:

Originally Posted by Bynx (Post 1073629)
Well ac, I sure didn't mean all that easy. But there are constants we are dealing with so there is an answer. A jpeg will cover 9 EV of the possible 36 EV which will actually be closer to 30. A Raw file would cover 12 EV?

Sorry, I was depending on memory and mental math. I looked again at DPReview's most recent tests. The best current jpeg engines across the board on 4:3, APS-C and full frame seem to deliver between 8.5 and 8.8 EV range although some still don't reach 8. Again, across the same range of sensors, RAW dynamic range is about 12. So we could probably achieve a 36 EV range HDR with 3 raw frames, and might just make it with 4 jpeg frames and definitely with 5. We would also need ND filters on one end of the exposure range and long exposures on the other.
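
A back-of-the-envelope check on those frame counts, treating the brackets as butted end to end with no overlap (real brackets would overlap, so these are best-case numbers, and 36 EV is just the rough figure from above):

Code:

import math

target_ev = 36    # rough sun-to-darkest-sky range discussed earlier

for label, per_frame_ev in (("raw", 12.0), ("jpeg, best", 8.8), ("jpeg, typical", 8.5)):
    frames = math.ceil(target_ev / per_frame_ev)
    print(f"{label:14s}: {per_frame_ev:4.1f} EV/frame -> {frames} frames for {target_ev} EV")

# 36/12 is exactly 3 raw frames; 4 jpeg frames at 8.8 EV only reach 35.2 EV,
# just short of 36, hence "might just make it with 4 and definitely with 5".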

Granted your question is a hypothetical one but would we want to picture the sun, high in the sky as just a round orb?

A. C.

ac.smith Apr 2, 2010 8:21 AM

Quote:

Originally Posted by TCav (Post 1073845)
ac.smith already gave us a significant list of the brightest and dimmest objects in the sky, in http://en.wikipedia.org/wiki/Apparent_magnitude. But these are all sources of light (either directly or indirectly.) There is no value given for the "apparent magnitude" of interstellar space, so even this doesn't answer the question. The dimmest object in the night sky is still transmitting or reflecting light, so while it is quite dim, it isn't actually "dark".

In a more simplified manner I caveated the brightness (darkness) as you have. My choice of magnitude 9 as black was based on 7-8 being the dimmest object discernible in the darkest sky on earth with the unassisted eye. Magnitude 9 is therefore dimmer than the least discernible object, but not 0. Nonetheless, it's about as black as we'll encounter in nature.

In the past absolute black was simulated in labs with a closed, curved, tapered tube (specifically a cow's horn) painted glossy black on the inside. Why glossy? The flattening agent in paint (silica - sand) is reflective.

A. C.

ac.smith Apr 2, 2010 8:49 AM

Alan T

Good to see you chime in, thanks. As you can tell, my issue with the original post was that it asked for a linear conversion of quantities we typically measure on a logarithmic scale. Nor has anyone attempted that yet, although it's not germane to the real question of quantifying the dynamic range advantages of HDR.

A. C.

TCav Apr 2, 2010 9:34 AM

Quote:

Originally Posted by ac.smith (Post 1073869)
In a more simplified manner I caveated the brightness (darkness) as you have. My choice of magnitude 9 as black was based on 7-8 being the dimmest object discernible in the darkest sky on earth with the unassisted eye. Magnitude 9 is therefore dimmer than the least discernible object, but not 0. Nonetheless, it's about as black as we'll encounter in nature.

Yes, but that presupposes a Persistence of Vision of 1/25 second, which doesn't apply in still photography. If you use a shutter speed of 1/12 second, then objects of magnitude 9 become visible, and if you use a shutter speed of 1/6 second, magnitude 10 objects become visible, etc. (...presuming an appropriate dynamic range.) And for HDR photography, such shutter speeds are not unheard of.

Alan T Apr 2, 2010 10:06 AM

Thanks, 'Tcav' and 'ac.'

Right, now we're getting somewhere (even if it is rather a discussion about the number of angels that can dance on the head of a pin).

So, to give Bynx a reasonably quantitative answer, I suggest we find out the magnitude of the dimmest astronomical object ever detected with an optical telescope (and therefore probably 'photographed' (so-to-speak) with a fancy digicam for publication somewhere) and give him the number (plus 1.0 so that it's optically invisible) of orders of magnitude between that and the brightest object (presumably the Sun).

Then he can choose how many exposures to take, to cover that spread of dynamic range, and compress them artistically (or auto-HDR them) into a nice picture. I fear it might take a while.

It would be much more fun if we organised a trip to a mountaintop in Hawaii (a) to meet selvin, and (b) to see for ourselves.

ac.smith Apr 2, 2010 10:37 AM

Quote:

Originally Posted by TCav (Post 1073898)
Yes, but that presupposes a Persistence of Vision of 1/25 second, which doesn't apply in still photography. If you use a shutter speed of 1/12 second, then objects of magnitude 9 become visible, and if you use a shutter speed of 1/6 second, magnitude 10 objects become visible, etc. (...presuming an appropriate dynamic range.) And for HDR photography, such shutter speeds are not unheard of.

Absolutely, photographic technology can and has over its history made visible what is not visible to the unassisted eye.

A. C.

TCav Apr 2, 2010 10:52 AM

But, once again, we wouldn't want to have direct sunlight in a photo, and we certainly wouldn't want to visually compose an image that included the sun without using at least an ND4 filter. So we should subtract 4 from the magnitude of the sun. ... or rather, add 4 to it.

Bynx Apr 2, 2010 11:46 AM

Quote:

Originally Posted by ac.smith (Post 1073851)
Granted your question is a hypothetical one but would we want to picture the sun, high in the sky as just a round orb?

A. C.

Actually I would. I'd like to see a shot of the sun with solar flares and any texture it has, just like we see the moon.

I'm getting lost in the discussion of interstellar light etc. In an outdoor scene on a bright sunny day, if an exposure is made for an optimum image covering as much of the range as possible, what would that range be? It's a given that Raw would be more than jpeg. So far as I understand it, that range would be 8 or 9 EV for jpeg and 12 for Raw. Now bracketing jpeg would increase that to a lot more than a Raw file. Now if we use our eyes in the same scene and allow them to adjust from the brightest to darkest areas, what range would our eyes have? And it's equivalent to our best HDR, correct?

TCav Apr 2, 2010 12:30 PM

Actually, "interstellar" would be "dark", not "light".

The human eye has a dynamic range of about 90dB, but not all at the same time. The size of the pupil affects the dynamic range. Pupil sizes range from 3mm to 9mm. That would imply that 90% of the dynamic range of the eye is attributable to the action of the iris. It seems to me, therefore, that the eye with a fixed pupil diameter would have a dynamic range of at least 10 EV.

I have no doubt that this is a gross oversimplification, and welcome your comments and criticisms.

ac.smith Apr 2, 2010 1:57 PM

Quote:

Originally Posted by Bynx (Post 1073947)
Actually I would. I'd like to see a shot of the sun with solar flares and any texture it has, just like we see the moon.

I'm getting lost in the discussion of interstellar light etc. In an outdoor scene on a bright sunny day, if an exposure is made for an optimum image covering as much of the range as possible, what would that range be? It's a given that Raw would be more than jpeg. So far as I understand it, that range would be 8 or 9 EV for jpeg and 12 for Raw. Now bracketing jpeg would increase that to a lot more than a Raw file. Now if we use our eyes in the same scene and allow them to adjust from the brightest to darkest areas, what range would our eyes have? And it's equivalent to our best HDR, correct?

Again you have two questions/comments. Pictures of solar flares, sunspots and other solar features start off with HEAVY filtration to get the absolute values into a range that can be handled with our viewing or photographic equipment. If I were interested, it would be straight to astronomy sources for me. HDR might be applicable, but it would be well outside the normal application of the process.

Now back to "normal" applications. I like TCav's estimate of the eye's dynamic range of EV10 and will use that for the rest of my comments.

A raw file will handle that (and this assumes the presentation medium will handle it as well). If the presentation is a print, then most jpeg engines should handle the range actually available. This is the description of a scene in which the iris is not actually making any adjustments.

A scene in which the iris is adjusting will require both assumptions and simplifications. In this case I am assuming an outdoor, front-lit scene between 2 hours after sunrise and 2 hours before sunset. Our hypothetical scene contains areas of interest that are both fully illuminated and in open shade. Some "middle" gray for the fully illuminated portion would be EV15, and in the open shade area the same gray would be EV12. Applying TCav's estimate, the eye is responding to EV7 (12-5) through EV20 (15+5), for a dynamic range of 13 EV including the iris's action. Even a raw photo doesn't quite handle this, but 2 raw frames (DR 24 EV) or even 2 jpeg frames (DR 18 EV) HDR processed would. Including snow or light sand (middle gray 16) would raise the DR to 14 but would still be within the 2-raw or 2-jpeg range.
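
Writing that worked example out as a few lines (the EV 15 / EV 12 middle-gray values and the plus-or-minus 5 EV eye response are just the assumptions stated above):

Code:

eye_half_range_ev = 5     # half of the assumed 10 EV fixed-pupil eye range
sunlit_gray_ev = 15       # middle gray, fully illuminated (assumed above)
shade_gray_ev = 12        # the same gray in open shade (assumed above)

darkest = shade_gray_ev - eye_half_range_ev      # EV 7
brightest = sunlit_gray_ev + eye_half_range_ev   # EV 20
print(f"scene spans EV {darkest} to EV {brightest}: {brightest - darkest} EV of range")

# 13 EV: more than a single ~12 EV raw frame, but easily inside what two
# bracketed raw (or jpeg) frames cover once they are merged as HDR.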

This is simply a way to approach the question/problem. There are obviously scenes that could encompass a greater DR but I'll leave it to others to describe/analyze them. If someone has better numbers for any element of this analysis feel free to jump in.

A. C.

TCav Apr 6, 2010 6:45 AM

Yes. You're absolutely correct. That's the purpose of HDR: to represent the large dynamic range of a scene within the small dynamic range of the medium. You're compressing the dynamic range of the scene so that it will fit within a compressed 8 bit JPEG.

What HDR does for photographs is very similar to what dbx and Dolby did for audio. During recording, they compress the audio spectrum so the highs and lows aren't stored on top of, and don't become indistinguishable from, noise from sources in the equipment, and they amplify the soft sounds so they stand out from noise from sources in the recording medium. Then, at playback, the soft sounds that were amplified are reduced to their original levels, and the audio spectrum is expanded to its original dynamic range. In that way, the audio is truer to the original, and isn't corrupted by problems inherent in the recording process. This technique is referred to as companding. (Much of this technique has become unnecessary with the introduction of digital recording devices, though Dolby has kept pace with Dolby Digital technologies.)

musket Apr 6, 2010 4:36 PM

Here's an article that might explain more http://www.cambridgeincolour.com/tut...amic-range.htm

Bynx Apr 6, 2010 5:21 PM

Excellent article Musket, Thanks.

JimC Apr 9, 2010 6:34 AM

Quote:

Originally Posted by ac.smith (Post 1073851)
The best current jpeg engines across the board on 4:3, APS-C and full frame seem to deliver between 8.5 and 8.8 EV range although some still don't reach 8.

They tested JPEG images at 9.4 stops from the Sony A550 at ISO 200 (with its DRO feature set to off). The full frame Sony A900 was also able to deliver 9.4 stops at ISO 200, and was still delivering a DR of 9.1 stops or higher through ISO 1600 with its jpeg images.

ac.smith Apr 9, 2010 9:08 AM

Quote:

Originally Posted by JimC (Post 1076930)
They tested JPEG images at 9.4 stops from the Sony A550 at ISO 200 (with its DRO feature set to off). The full frame Sony A900 was also able to deliver 9.4 stops at ISO 200, and was still delivering a DR of 9.1 stops or higher through ISO 1600 with its jpeg images.

Yes, the A550 and A900 Sonys seem to deliver impressive (class-leading) DR in jpeg. The A900 also seems to have much better raw DR than most, while the A550 seems to be a bit less than the norm. I say that cautiously on the A550 since they're labeling the raw chart a bit differently. The A380's DR, both jpeg and raw, seems to be more like the rest of the marketplace.

A. C.

