Steve's Digicams Forums > Software > Resizing / Interpolation

May 14, 2007, 4:31 AM   #1
markhooper
Junior Member

Join Date: May 2007
Posts: 1

Hi there

I have heard some "experts" say that when interpolating an image up, it's best to start with a higher bit depth image. So the advice is to convert the image from 8-bit to 16-bit before interpolating, or to save the image from the original raw file as a 16-bit TIFF to begin with.

Frankly, I don't see how bit depth makes any difference to interpolation.

Does anyone else have a view or any experience on this?
May 15, 2007, 11:09 PM   #2
granthagen
Senior Member

Join Date: Jun 2005
Posts: 804

No experience, my man! But, perhaps, to goad more knowledgeable people into ponying up an answer, I'll just render an opinion.

On the face of it, it seems to me that the more information that an interpolation program has to work with, the more faithful it could be to the original data as it does its work.

Short 'n sweet.

Okay you printing aficionados, tell me how dumb I am! Ain't nothin' new, 'n I can take it!

Grant
May 29, 2007, 3:19 PM   #3
Corpsy
Senior Member

Join Date: Dec 2005
Posts: 879

granthagen wrote:
Quote:
On the face of it, it seems to me that the more information that an interpolation program has to work with, the more faithful it could be to the original data as it does its work.
That's pretty much it. If you work with more precise data, your adjustments can be calculated more accurately.
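
A tiny example of what "more precise" buys you (just toy numbers in Python/NumPy, not the math of any particular editor): the halfway point between two adjacent 8-bit shades has to collapse back onto one of them, while 16 bit has room for a genuine in-between value.

Code:
import numpy as np

# Two adjacent 8-bit shades: their midpoint cannot be stored at 8 bits
a8, b8 = 100, 101
mid8 = np.uint8((a8 + b8) / 2)        # 100.5 has to land on an existing level
print(mid8)                           # -> 100, the in-between shade is lost

# The same two shades promoted to 16 bits (scale by 257, so 255 maps to 65535)
a16, b16 = 100 * 257, 101 * 257
mid16 = np.uint16((a16 + b16) / 2)    # plenty of levels in between
print(mid16)                          # -> 25828, a genuine intermediate shade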

I'm not sure if this is true of every single camera, but for the most part, any camera that can capture RAW and TIFF files captures an image that is 12 bits per color channel (saved in a 16-bit container in the case of TIFF). A 12-bit image can distinguish up to about 68.7 billion color values, while 8-bit only allows 16.7 million.
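
Just to put numbers on those figures (plain powers of two, nothing camera specific):

Code:
levels_8  = 2 ** 8        # 256 shades per channel at 8 bits
levels_12 = 2 ** 12       # 4096 shades per channel at 12 bits

print(levels_8 ** 3)      # 16,777,216 -> the "16.7 million" figure
print(levels_12 ** 3)     # 68,719,476,736 -> the "68.7 billion" figure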

That means that when the image is saved at 8 bits, over 99% of those colors have to be mapped onto something the 8-bit format can represent. A lot of colors that were very similar either become the same color or end up further apart, so transitions from one color to another become less smooth. When you then interpolate, a 16-bit file would usually have plenty of values that fit between two neighboring colors, but an 8-bit file often has none, so the interpolation just duplicates the existing colors, and that is what creates banding.
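
A quick way to see that banding effect is to quantize a shallow gradient at both bit depths, upsize it, and count how many distinct levels survive. This is only a toy NumPy sketch with plain linear interpolation, not what any particular RAW converter or editor does:

Code:
import numpy as np

# A shallow gradient (think of a clear sky) covering about 2% of the tonal range
width = 64
ramp = np.linspace(0.39, 0.41, width)               # the "true" continuous values, 0..1

ramp8  = np.round(ramp * 255).astype(np.uint8)      # saved at 8 bits
ramp16 = np.round(ramp * 65535).astype(np.uint16)   # saved at 16 bits

print(len(np.unique(ramp8)))    # only ~7 distinct levels survive at 8 bits
print(len(np.unique(ramp16)))   # all 64 pixels keep distinct values at 16 bits

# Upsize 4x with plain linear interpolation, storing back at the same bit depth
x_old = np.arange(width)
x_new = np.linspace(0, width - 1, width * 4)
up8  = np.round(np.interp(x_new, x_old, ramp8.astype(float))).astype(np.uint8)
up16 = np.round(np.interp(x_new, x_old, ramp16.astype(float))).astype(np.uint16)

print(len(np.unique(up8)))      # still ~7 levels -> wide flat bands
print(len(np.unique(up16)))     # ~256 levels -> a smooth transition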

But I suppose the real questions are: how noticeable would the difference be, and how helpful is it to convert to 16 bit once the damage is already done? I rarely upsize my images, and when I do it's usually no more than 200%, so even when I upsize JPEGs I don't tend to notice much banding or other problems. I guess I've never tried converting to 16 bit first to compare, so I'll have to experiment with that soon.
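
For the "damage already done" case, the same toy gradient suggests an answer: once the data has been rounded to 8 bits, promoting it to 16 bits before upsizing lets the interpolator write in-between values at the band edges, but it cannot bring back the shades that were thrown away. Again just a NumPy sketch under those assumptions, not a claim about any particular converter:

Code:
import numpy as np

# The damage is done: the shallow gradient was already saved at 8 bits
width = 64
ramp8 = np.round(np.linspace(0.39, 0.41, width) * 255).astype(np.uint8)

x_old = np.arange(width)
x_new = np.linspace(0, width - 1, width * 4)

# Option A: upsize and keep the result at 8 bits
a = np.round(np.interp(x_new, x_old, ramp8.astype(float))).astype(np.uint8)

# Option B: promote to 16 bits first (multiply by 257 so 255 -> 65535), then upsize
ramp16 = ramp8.astype(np.uint16) * 257
b = np.round(np.interp(x_new, x_old, ramp16.astype(float))).astype(np.uint16)

print(len(np.unique(a)))   # still only ~7 levels: rounding back to 8 bits re-quantizes everything
print(len(np.unique(b)))   # around 30 levels: the band edges pick up in-between values,
                           # but the wide plateaus from the original 8-bit save remain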

I do shoot a lot of RAW, though, and when I want to upsize those images I do it right in the RAW converter, which intuitively makes a lot of sense. I try to make as many adjustments in the converter as possible to ensure the best quality.

Anyway, I guess I've ranted long enough without actually answering your question. This isn't something I've ever thought about seriously, so I guess I'll try it out soon. I'll let you know if I find anything useful.