#1
Junior Member
Join Date: Jul 2006
Posts: 15
Is it possible to fully display everything that is in a RAW image (all those 14 bits or so)? There are already some high-contrast displays, so where does the problem lie? Is it only a software issue?
cheers
#2
Senior Member
Join Date: Jan 2007
Posts: 1,318
How often do you deal with your email on a bit-by-bit level?
That is what RAW is, and why it is so huge. The reason for it is that you can endlessly manipulate it... and yes, you have to handle EVERY image one by one... it is, after all, a RAW product, not a FINISHED one... hence the name. Some cameras, like my Pentax K10D, will let you shoot RAW + JPG simultaneously. If the JPG is good enough, just trash the RAW... or, vice versa, you have the ability to fix a lot you couldn't otherwise and ditch the JPG.
#3
Junior Member
Join Date: Jul 2006
Posts: 15
Hmm, somehow I can't figure out the answer to my question from your post...
We are getting rid of some information by converting from RAW to 8-bit formats so that we can see them on our 'ordinary' 8-bit displays. Right? But what if someone had an HDR monitor capable of displaying simultaneously all the bits that are stored in the RAW? That is my question: is it possible, and what hardware/software would one need? Or am I confusing things?
#4
Senior Member
Join Date: Aug 2002
Posts: 851
TIFF or PSD (and probably other file formats) can be 16 bits per channel, and we look at them on our monitors.
I guess I do not understand your question.
#5
Senior Member
Join Date: Jan 2007
Posts: 956
By definition, since the information is RAW, it always has to be interpreted prior to display, so the simple answer is no: RAW itself cannot be displayed without quite a bit of intervention by software. Contrast or other monitor capabilities aren't the limiting factor.
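For anyone curious what that interpretation step looks like in practice, here is a minimal sketch, assuming the third-party rawpy (a LibRaw wrapper) and imageio packages; the file names are just placeholders:

```python
import rawpy     # LibRaw wrapper: reads the undemosaiced sensor data
import imageio

# A RAW file holds one 12/14-bit value per photosite behind a color filter,
# so it must be demosaiced and mapped to RGB before it can be displayed at all.
with rawpy.imread("photo.nef") as raw:
    rgb16 = raw.postprocess(output_bps=16)   # keep 16 bits per channel
    imageio.imwrite("photo16.tiff", rgb16)   # a viewable 16-bit-per-channel TIFF
```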
#6
Senior Member
Join Date: Apr 2003
Location: Eastern Ontario Canada
Posts: 823
I think you are asking whether there are monitors that can display 48-bit colour as well as 24-bit?
I'm not sure about monitors, but I have heard of high-end inkjet printers that will print 48-bit colour.
#7
Senior Member
Join Date: Dec 2005
Posts: 879
K J wrote:
Quote:
When you take a RAW photo, the camera captures a range of light values as accurately as it can in 12-bit code (stored in a 16-bit container). That 12 bits is actually 12 bits per color channel per pixel, so 36 bits altogether. 12 bits per color channel means 4096 shades of red, 4096 shades of green, and 4096 shades of blue, or 68,719,476,736 different colors. When you view it on your monitor, it is downgraded to 8 bits per color channel per pixel, or 24 bits altogether. That's 16,777,216 colors. While that is far less than what the RAW contains, do you really think you could detect the difference between color 12,000,000 and 12,000,001?

The extra data in a RAW file is for the most part redundant. It's there so that if you make drastic adjustments to an image, say by turning the contrast up 1000%, there's plenty of extra color information to prevent posterizing (visible separations between color shades) and other problems. It also stores a lot more data in the shadow areas, allowing for a lot of brightening when underexposed, and more precise color information for cleaner saturation adjustments.

So ultimately, the point of RAW isn't that it's a better representation of the image. The point is that it is detailed enough to allow for significant modifications with far less quality loss than would happen with an 8-bit image. If you were to compare a perfectly exposed 8-bit image to its 12-bit RAW counterpart, even on a monitor capable of displaying the entire range of RAW colors, it's unlikely you would see any difference whatsoever.
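Those counts are straightforward to check; a quick sketch of the arithmetic (nothing here is camera-specific):

```python
# Shades per channel and total RGB colors for a given bit depth per channel.
def color_counts(bits_per_channel: int) -> tuple[int, int]:
    shades = 2 ** bits_per_channel      # e.g. 2**12 = 4096
    return shades, shades ** 3          # three channels: R, G, B

for bits in (8, 12, 14, 16):
    shades, total = color_counts(bits)
    print(f"{bits} bits/channel: {shades:,} shades per channel, {total:,} colors")

# 8 bits/channel:  256 shades per channel,   16,777,216 colors
# 12 bits/channel: 4,096 shades per channel, 68,719,476,736 colors
```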
#8
Junior Member
Join Date: Jul 2006
Posts: 15
Quote:
Quote:
Quote:
#9
Senior Member
Join Date: Dec 2005
Posts: 879
I'm not sure you're completely understanding my point, so I'll try to illustrate it more clearly.

On your point about 8-bit B&W, that is certainly an exception. 256 shades of gray, while usually enough for most purposes, is rather lacking. That's why a lot of people prefer to store their B&W images as 24-bit color images. Technically there are still only 256 shades of true gray, but adding minuscule amounts of red, green, and blue can increase that number by thousands without visibly tinting the image.

As far as being able to see all the shades of color, let me demonstrate something.

[image] Here is a very high contrast image with a lot of detail hidden in the shadows.

[image] Here it is brightened up so you can see all the details.

It isn't necessary to even brighten the image itself to see that detail; you can simply increase the gamma in your operating system's video settings, and the original image would look like the altered one. Of course, that's very impractical for print purposes, as your monitor would be completely uncalibrated compared to any printer.

But perhaps what you mean is that you'd like to view the original image as it appeared in real life. The problem is that in real life, light values can get extremely high. Just look at the sun. That's pretty bright. In order to properly display all the color values captured when taking a photo with the sun in it, your monitor would have to be capable of displaying light values that high. Even if it were possible for your monitor to get within 1/100th of that brightness level, you still wouldn't see all the details in the shadows, because you'd be blinded by the light coming from the brighter areas, just as you would be in real life. Maybe this all sounds like an exaggeration, but it really isn't.

My point about seeing the difference between color 12,000,000 and 12,000,001 was that any monitor can display those two colors accurately, but the difference is so slight you can't see it. In order to actually be able to see that difference, the contrast between those two color values would have to increase dramatically. Here's a demonstration of that:

[image] This image has 4 colors in it. The two in the middle are a neutral shade of gray, and the same shade with 1 extra bit of red. The one on top is 40% brightness; the one on bottom is 60%.

[image] Here it is with the contrast turned way up. Notice that while the top and bottom colors have changed in value drastically, the ones in the center are still extremely close. This is the posterization you are saying you'd see on a high-contrast, HDR monitor.
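The gamma trick described above is easy to reproduce; a minimal sketch, assuming NumPy and Pillow are installed (the file names are only placeholders):

```python
import numpy as np
from PIL import Image

def apply_gamma(img: Image.Image, gamma: float) -> Image.Image:
    """Apply a gamma curve; gamma > 1 lifts the shadows without clipping highlights."""
    arr = np.asarray(img).astype(np.float32) / 255.0
    lifted = arr ** (1.0 / gamma)
    return Image.fromarray((lifted * 255.0).round().astype(np.uint8))

# Usage (placeholder file names):
# dark = Image.open("high_contrast.jpg")
# apply_gamma(dark, 2.2).save("brightened.jpg")
```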
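The four-color contrast demonstration can also be checked numerically; a small sketch with made-up 8-bit values matching the description above:

```python
import numpy as np

# Four colors from the demo: ~40% gray, two mid grays that differ by a single
# 8-bit step in the red channel, and ~60% gray.
colors = np.array([
    [102, 102, 102],   # ~40% brightness
    [128, 128, 128],   # neutral mid gray
    [129, 128, 128],   # same gray plus 1 extra level of red
    [153, 153, 153],   # ~60% brightness
], dtype=np.float32)

def boost_contrast(rgb: np.ndarray, factor: float) -> np.ndarray:
    """Scale each value's distance from mid gray (128), then clip to 0..255."""
    return np.clip((rgb - 128.0) * factor + 128.0, 0.0, 255.0)

print(boost_contrast(colors, 4.0))
# The 40%/60% grays swing toward black and white, while the two middle colors
# still differ by only a few levels -- far too little to see on any monitor.
```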