May 3, 2007, 11:00 AM   #1
K J
Junior Member

Is it possible to fully display what is in a RAW image (all those 14 bits or so)? There are already some high-contrast displays, so where does the problem lie? Is it only a software issue?
cheers
May 5, 2007, 9:18 AM   #2
Hayward
Senior Member

How often do you deal with your email on a bit-by-bit level?

That is what RAW is, and why it is so huge.

There are reasons for it: you can endlessly manipulate it.... and yes, you have to handle EVERY image one by one.... it is, after all, a RAW product, not a FINISHED one.... hence the name.

Now some cameras, like my Pentax K10D, will allow you to shoot RAW + JPG simultaneously..... if the JPG is good enough, just trash the RAW.... or, vice versa, keep the RAW, which lets you fix a lot you couldn't otherwise, and ditch the JPG.


May 5, 2007, 3:40 PM   #3
K J
Junior Member

Hmm, somehow I can't figure out the answer to my question from your post...
We throw away some information when converting from RAW to 8-bit formats so that we can see the images on our 'ordinary' 8-bit displays, right? But what if someone had an HDR monitor capable of simultaneously displaying all the bits that are stored in the RAW? That is my question: is it possible, and what hardware/software would one need? Or am I confusing things?
May 5, 2007, 5:51 PM   #4
amazingthailand
Senior Member

TIFF or PSD (and probably other file formats) can hold 16 bits per channel, and we look at them on our monitors.
I guess I do not understand your question.
May 5, 2007, 5:57 PM   #5
JDar
Senior Member

By definition since the information is RAW, it always has to be interpreted prior to display, so the simple answer is no, RAW itself cannot be displayed without quite a bit of intervention by software. Contrast or other capability of monitors isn't the limiting factor.
May 5, 2007, 7:36 PM   #6
Bob Nichol
Senior Member

I think you are asking whether there are monitors that can display 48-bit colour as well as 24-bit?

I'm not sure about the monitors, but I have heard of high-end ink jet printers that will print 48-bit colour.
May 5, 2007, 8:43 PM   #7
Corpsy
Senior Member

K J wrote:
Quote:
Is it possible to fully display what is in a RAW image (all those 14 bits or so)? There are already some high-contrast displays, so where does the problem lie? Is it only a software issue?
cheers
"High contrast" is a pretty relative term, and also a practically meaningless bit of marketing lingo when used out of context (similar to megapixel). A monitor may be advertised as having a certain contrast ratio, but all that is is a comparison between the darkest value it can recreate and the brightest. If a monitor could display values from 0 lumens to 1, it would still look nearly black no matter what you were viewing, but technically it would have an infinity:1 contrast ratio.

When you take a RAW photo, the camera captures a range of light values as accurately as it can in 12 bit code (stored in a 16 bit container). That 12 bits is actually 12 bits per color channel per pixel, so 36 bits altogether. 12 bits per color channel means 4096 shades of red, 4096 shades of green, and 4096 shades of blue, or 68,719,476,736 different colors.

When you view it on your monitor, it is downgraded to 8 bits per color channel per pixel, or 24 bits altogether. That's 16,777,216 colors. While that is far less than what the RAW contains, do you really think you could detect the difference between color 12,000,000 and 12,000,001? The extra data of a RAW file is for the most part redundant. It's there so that if you make drastic adjustments to an image, say by turning up the contrast 1000%, there's plenty of extra color information in there to prevent posterizing (visible separations in different color shades) and other problems. It also stores a lot more data in the shadow areas allowing for a lot of brightening when underexposed, and more precise color information for cleaner saturation adjustments.

So ultimately, the point of RAW isn't that it's a better representation of the image. The point is that it is detalied enough to allow for significant modifications with far less quality loss than would happen with an 8 bit image. If you were to compare a perfectly exposed 8 bit image to it's 12 bit RAW counterpart, even on a monitor capable of displaying the entire range of RAW colors, it's unlikely you would see any difference whatsoever.
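
For anyone who wants to check the arithmetic, here is a rough sketch in plain Python (the 12-bit and 8-bit figures are the ones assumed above; the 4x shadow boost is just an illustrative edit, not what any particular converter does):

[code]
# Levels per channel and total colors for the two bit depths discussed above.
for bits in (8, 12):
    per_channel = 2 ** bits                       # distinct values per color channel
    print(bits, per_channel, per_channel ** 3)    # R x G x B combinations
# 8  -> 256 levels,  16,777,216 colors
# 12 -> 4096 levels, 68,719,476,736 colors

# Why the extra precision only matters once you edit hard: five neighbouring
# shadow tones as a 12-bit sensor might record them, and the same tones after
# being saved to 8 bits, both pushed through a strong 4x shadow boost.
tones_12 = [640, 648, 656, 664, 672]              # 16 RAW steps per 8-bit step
tones_8 = [round(v / 16) for v in tones_12]       # what an 8-bit file would keep

gain = 4
from_8 = [min(255, v * gain) for v in tones_8]            # flat runs, 4-level jumps
from_12 = [min(4095, v * gain) // 16 for v in tones_12]   # evenly spaced steps

print(from_8)     # e.g. [160, 160, 164, 168, 168]  the banding/posterization
print(from_12)    # e.g. [160, 162, 164, 166, 168]  the RAW headroom survives
[/code]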
May 9, 2007, 4:37 PM   #8
K J
Junior Member

Quote:
do you really think you could detect the difference between color 12,000,000 and 12,000,001?
No, but actually that's the goal (to have smooth steps). Look at it from the other point of view: with 8 bits per channel it's ONLY 256 shades of grey, so on a good monitor it is possible to see posterization (there's a small sketch at the end of this post showing the kind of gradient where it shows up).

Quote:
It also stores a lot more data in the shadow areas
Yes, exactly, that is my point. Being able to display it all simultaneously would, IMO, make a big difference compared to 8 bits per channel, which often lacks the dynamic range (clipped highlights/shadows).

Quote:
If you were to compare a perfectly exposed 8-bit image to its 12-bit RAW counterpart, even on a monitor capable of displaying the entire range of RAW colors, it's unlikely you would see any difference whatsoever.
I still bet that it would make a difference (like in the example above).
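
Just to illustrate the 256-greys point, a tiny sketch in plain Python (the strip width is made up purely for illustration):

[code]
# An 8-bit grey ramp across a 1024-pixel-wide strip: only 256 distinct values
# are available, so each value has to cover a run of about 4 pixels.
width = 1024
ramp = [round(x * 255 / (width - 1)) for x in range(width)]

bands = sum(1 for i in range(1, width) if ramp[i] != ramp[i - 1]) + 1
print(bands, "bands, about", width // bands, "pixels per band")
# -> 256 bands of roughly 4 pixels each; on a good monitor those steps can
#    show up as visible banding in what should be a smooth gradient.
[/code]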
May 9, 2007, 5:51 PM   #9
Corpsy
Senior Member

I'm not sure you're completely understanding what my point is. I'll try to illustrate more clearly.

On your point about 8-bit B&W, that is certainly an exception. 256 shades of gray, while usually enough for most purposes, is rather limited. That's why a lot of people prefer to store their B&W images as 24-bit color images. Technically there are still only 256 shades of true gray, but adding minuscule amounts of red, green, and blue can increase the number of distinct tones by thousands without visibly tinting the image.
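
A little sketch of that trick in plain Python (the luminance weights are the common Rec. 601 ones, used here purely for illustration):

[code]
# Perceived brightness (Rec. 601 luma) of a true grey, and of the same grey
# with a single channel nudged up by one level.
def luma(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

print(luma(128, 128, 128))   # ~128.0  : true grey number 128
print(luma(128, 128, 129))   # ~128.11 : a tone between grey 128 and grey 129
print(luma(128, 129, 128))   # ~128.59 : another in-between tone
# The colour cast from one extra level in a single channel is far too small to
# see, but it buys thousands of extra distinguishable tones across the range.
[/code]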

As far as being able to see all the shades of color, let me demonstrate something to you.



Here is a very high contrast image with a lot of detail hidden in the shadows.




Here it is brightened up so you can see all the details. It isn't even necessary to brighten the image itself to see that detail; you could simply increase the gamma in the video settings for your operating system, and the original image would look like the altered one. Of course, that's very impractical for print purposes, as your monitor would be completely uncalibrated compared to any printer.
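
Roughly what a gamma tweak does, sketched in plain Python (the gamma value and the sample pixel values are just made up for illustration):

[code]
# A gamma curve lifts the shadows a lot while barely touching the highlights,
# which is why raising the gamma in the video settings reveals shadow detail.
def apply_gamma(value, gamma=2.2):
    return round(255 * (value / 255) ** (1 / gamma))

for v in (5, 20, 60, 128, 250):     # dark ... bright 8-bit pixel values
    print(v, "->", apply_gamma(v))
# roughly: 5 -> 43, 20 -> 80, 60 -> 132, 128 -> 186, 250 -> 253
[/code]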

But perhaps what you mean is that you'd like to view the original image as it appeared in real life. The problem is, in real life light values can get extremely high. Just look at the sun. That's pretty bright. In order to properly display all the color values captured when taking a photo with the sun in it, your monitor would have to be capable of displaying light values that high. Even if it was possible for your monitor to get within 1/100th of that brightness level, you still wouldn't see all the details in the shadows because you'd be blinded by the light coming from the brighter areas, just as you would be in real life.

Maybe this all sounds like an exaggeration, but it really isn't. My point about seeing the difference between color 12,000,000 and 12,000,001 was that any monitor can display those two colors accurately, but the difference is so slight you can't see it. In order to actually be able to see that difference, the contrast between those two color values would have to increase dramatically.

Here's a demonstration of that:



This image has 4 colors in it. The two in the middle are a neutral shade of gray and the same shade with the red channel raised by a single level. The one on top is 40% brightness; the one on the bottom is 60%.




Here it is with the contrast turned way up. Notice that while the top and bottom colors have changed drastically in value, the two in the center are still extremely close. This is the posterization you are saying you'd see on a high-contrast, HDR monitor.
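
For anyone who wants to reproduce that demonstration numerically, here is a rough sketch in plain Python (the exact percentages and the stretch factor are assumptions, not the settings used for the images above):

[code]
# Four tones like those in the demo image: 40% grey, a neutral mid grey, the
# same mid grey with the red channel one level higher, and 60% grey. A hard
# contrast stretch blows out the outer two long before the middle two separate.
def stretch(value, factor=8.0, pivot=128):
    return max(0, min(255, round((value - pivot) * factor + pivot)))

tones = {
    "40% grey":         (102, 102, 102),
    "mid grey":         (128, 128, 128),
    "mid grey + 1 red": (129, 128, 128),
    "60% grey":         (153, 153, 153),
}
for name, rgb in tones.items():
    print(name, "->", tuple(stretch(c) for c in rgb))
# 40% grey          -> (0, 0, 0)        clipped to black
# mid grey          -> (128, 128, 128)
# mid grey + 1 red  -> (136, 128, 128)  still only a small difference in red
# 60% grey          -> (255, 255, 255)  clipped to white
[/code]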