Steve's Digicams Forums > Digicam Help > General Discussion

View Poll Results: which mirrorless format would you prefer?
FF 2 22.22%
APS-C 4 44.44%
4/3rds 1 11.11%
1/1.6" 2 22.22%
1/2.5" 0 0%
1/3" & smaller 0 0%
Voters: 9.

Old Mar 22, 2009, 2:57 PM   #21
Banned
 
Join Date: Mar 2009
Posts: 43

" They do not care (at this time) to compare their images with those taken with larger sensor cameras."

I disagree with this. Anyone who takes a picture can't help but compare the IQ to what they've seen before and will see in the future, much of which will come from larger-sensor cameras. Perhaps at a given moment they are not interested in that comparison, but sooner or later it's inevitable.

One must reasonably conclude that what matters most to people who buy low-end p&s's and small-frame gear is price, size & weight. That doesn't mean it has always been, or always will be, the case for those same shooters. And I don't care *what* gear you buy: you're going to suffer if you shoot jpegs, and that's in the bag for every digital shooter no matter what gear they have. Sooner or later you're going to crank up the ISO on your camera and inevitably conclude that high ISO = low IQ. How that correlation eases with increasing sensor size is clear to anyone who cares to delve into such things, but the IQ itself is right there in front of your face.


touristguy87
Old Mar 22, 2009, 3:23 PM   #22

"What resolution do we really need for prints? Ctein concludes that for a "perfect" 8x10 print we will need 400mp. So even for people like me who print predominantly at 8x10 we are a long way short of perfect. Although I do think that for A4 and even A3 work M4/3 is "good enough" for 95% of applications."

My 17" diagonal laptop monitor is 2MP, 16:10 and 72dpi, and on it, images look just fine...in a format that's significantly larger than an 8x10" print.

Why do I need a 12MP print, much less a 400MP print? What's the goal?

Where is the quality advantage?

Why print pixels that are so small that I have no hope of seeing them?

One can only conclude that it is to drive up the price of printers and cameras, and to give people something to do with their 9600dpi scanners. Either I should be able to print large-format at 72dpi just fine as well, or else I would need to scale up the print-resolution along with the format. I know pros who print 20meter panos at 50dpi, with 700MP images. I doubt that I need a 400MP image for a "perfect" 8x10" print. I don't even need 12MP. I need a printer with a decent fill-factor. 100dpi should be enough.
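As a sanity check on the numbers being thrown around here, the pixel counting is easy to do. A rough sketch (the print sizes and resolutions are the ones mentioned in this post; nothing else is assumed):

```python
import math

def print_megapixels(width_in, height_in, ppi):
    """Megapixels needed to supply one image pixel per printed dot."""
    return (width_in * ppi) * (height_in * ppi) / 1e6

# An 8x10" print at 100 ppi needs under 1 MP of image data:
mp_100 = print_megapixels(8, 10, 100)        # 0.8 MP
# At the conventional 300 ppi it needs about 7 MP:
mp_300 = print_megapixels(8, 10, 300)        # 7.2 MP
# And the output resolution implied by a "400 MP perfect 8x10":
implied_ppi = math.sqrt(400e6 / (8 * 10))    # ~2236 ppi
```

So the gap under discussion is between sub-1MP prints at display-like resolutions and an implied output resolution north of 2000 ppi.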

The reason that we need 300dpi & up is that the printers we print on have such low fill-factors: if you are only putting 300 dots in an inch, that leaves 300 spaces in that inch too, and it's easy to see that when you're holding the print. Raise the fill-factor and that problem goes away. In fact, the higher the resolution of the printer, the more of a problem this is, because the fill-factor has to go down with increasing resolution. A dot that works at 9600dpi will have an abysmally low fill-factor at 300dpi. All you're doing by driving up the resolution of the printer is driving up the resolution required to fill the image, but beyond a certain point you can't even see the improvement in print-resolution. So it's "perfect" in what sense? The perfect balance between a mfg's ability to make and print an image and their manufacturing cost?
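The fill-factor claim can be put in rough numbers with a toy model. This assumes square, non-overlapping dots with no dot gain, which is not how a real print head behaves; it is only meant to illustrate the scaling described above:

```python
def fill_factor(dot_dpi, grid_dpi):
    """Fraction of each grid cell covered by a dot sized for dot_dpi,
    placed on a coarser grid_dpi grid (toy model: square dots, no dot
    gain, no overlap)."""
    return (grid_dpi / dot_dpi) ** 2

# A dot sized for 9600 dpi, placed on a 300 dpi grid, covers
# less than 0.1% of its cell in this model:
ff = fill_factor(9600, 300)   # ~0.00098
```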
touristguy87
Old Mar 22, 2009, 4:16 PM   #23

"If panasonic invests anywhere near what nikon and canon spend on marketing and distribution, 4/3rds marketshare will grow."

I have to agree with this, if only because the people who don't know what they are buying are buying mainly on reputation and half-information. The experienced, knowledgeable consumer not only has most of the gear that they want; they're not going to let themselves get gouged on price or performance.
The big money, the easy money, is in selling to people who have no clue what they're buying but are desperate enough to buy it anyway.

I don't think that *Nikon* really cares about this market (they are more interested in taking Canon's market-share at the high end), and I doubt that Canon has anything really interesting to put in it (and if they did, Panasonic, Fuji, Sony & Samsung would eat into that).

The mfg I see taking it on the chin is Canon. They have the most to lose and the weakest defense. Really the only products of any significance that they have released lately are the 5DMk2, G9 and G10, and that's not saying a whole lot. Conservative, predictable...they will keep the Canon name out there but not do much to extend their technological lead, and Panasonic has already caught them with the LX3, and the G1-H has to be matching the EOS line camera for camera. I can't see anyone in their right mind buying a Rebel or D60 now, likewise the A700 or the 40D...why, for a halfway-decent ISO1600? For an unusable ISO6400? But yeah, these are real specs on the side of the box.
touristguy87
Old Mar 23, 2009, 4:35 AM   #24
Super Moderator

Join Date: Nov 2004
Posts: 3,599

************ wrote:
Quote:
"What resolution do we really need for prints? Ctein concludes that for a "perfect" 8x10 print we will need 400mp. So even for people like me who print predominantly at 8x10 we are a long way short of perfect. Although I do think that for A4 and even A3 work M4/3 is "good enough" for 95% of applications."

My 17" diagonal laptop monitor is 2MP, 16:10 and 72dpi, and on it, images look just fine...in a format that's significantly larger than an 8x10" print.

Why do I need a 12MP print, much less a 400MP print? What's the goal?

Where is the quality advantage?

Why print pixels that are so small that I have no hope of seeing them?

One can only conclude that it is to drive up the price of printers and cameras, and to give people something to do with their 9600dpi scanners. Either I should be able to print large-format at 72dpi just fine as well, or else I would need to scale up the print-resolution along with the format. I know pros who print 20meter panos at 50dpi, with 700MP images. I doubt that I need a 400MP image for a "perfect" 8x10" print. I don't even need 12MP. I need a printer with a decent fill-factor. 100dpi should be enough.

The reason that we need 300dpi & up is that the printers we print on have such low fill-factors: if you are only putting 300 dots in an inch, that leaves 300 spaces in that inch too, and it's easy to see that when you're holding the print. Raise the fill-factor and that problem goes away. In fact, the higher the resolution of the printer, the more of a problem this is, because the fill-factor has to go down with increasing resolution. A dot that works at 9600dpi will have an abysmally low fill-factor at 300dpi. All you're doing by driving up the resolution of the printer is driving up the resolution required to fill the image, but beyond a certain point you can't even see the improvement in print-resolution. So it's "perfect" in what sense? The perfect balance between a mfg's ability to make and print an image and their manufacturing cost?
Eh?

The print drivers do fill the gaps. Epson, for example, suggest that for resolutions down to 180ppi you should not upsample an image in Photoshop, but let their drivers do the interpolation, or "gap filling". Below 180ppi, however, they cannot interpolate well enough.

It's not important that you doubt that you need 30 lp/mm to get a "perfect" print; that's what the data show. People CAN see the difference in tests between high- and low-resolution prints. The point at which they stop seeing those differences is at around 30 lp/mm, or 400MP in an 8x10 print. Read the Ctein article again: at 5 lp/mm people start to think the print looks "good", from 5 to 30 it looks better and better, and beyond 30 they cannot see any further improvement.
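For readers who want to connect the lp/mm and megapixel figures: at the Nyquist minimum you need 2 pixels per line pair. A rough sketch (the 3-pixels-per-line-pair variant is an assumption added here to show where a figure near 400MP can come from, not something taken from the Ctein article):

```python
MM_PER_INCH = 25.4

def megapixels_for_lpmm(width_in, height_in, lp_per_mm, px_per_lp=2):
    """MP needed to convey lp_per_mm across a print, sampling each
    line pair with px_per_lp pixels (2 = bare Nyquist minimum)."""
    ppi = lp_per_mm * px_per_lp * MM_PER_INCH
    return (width_in * ppi) * (height_in * ppi) / 1e6

# 30 lp/mm over an 8x10 at the bare Nyquist minimum: ~186 MP
nyquist = megapixels_for_lpmm(8, 10, 30)
# With ~3 pixels per line pair the number lands near 400 MP: ~418 MP
conservative = megapixels_for_lpmm(8, 10, 30, px_per_lp=3)
```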

Of course for very large prints you only need lower-dpi output, because of viewing distance. But that is not a "standard print". How is it that you don't know this stuff? See:

Ray, Sidney F. 2000. The geometry of image formation. In The Manual of Photography: Photographic and Digital Imaging, 9th ed. Ed. Ralph E. Jacobson, Sidney F. Ray, Geoffrey G. Atteridge, and Norman R. Axford. Oxford: Focal Press. ISBN 0-240-51574-9

Computer monitors don't need such high linear resolution because of their high spacial resolution; they can display 16.7m colours in each pixel. Additionally their extra contrast makes them require even less resolution to get the same viewing experience. It's also no longer the case that all monitors use 72dpi. Many monitors now use 96dpi as their standard resolution and we will see that rise to 120dpi over the next few years. When viewing images on the higher resolution monitors you can certainly see the difference.

The quest for higher resolutions is not driven by some conspiracy to make cameras and printers more expensive; it is driven by the search for extra quality that differentiates products in the marketplace. Most consumers don't give a hoot about the specifications, but when you put a high-res image next to a low-res one the differences are so stark that everyone can see them and is willing to fork out the cash for the extra quality. As long as the consumer can see those differences and is willing to pay for them, the manufacturers will continue to push for higher and higher resolutions. At some point, probably in the not-too-distant future (10-20 years?), we will reach the point where resolutions are high enough that we can no longer see any improvements.

But you do point to an interesting development that one sees in galleries, and which has implications for one's own choice of technologies. If, for example, you don't want to carry around large, expensive camera equipment but still want your exhibits (be they at home or in galleries) to look fantastic, you do have the option of shooting at a lower resolution and displaying digitally.

For example: for the last few years, the finalists in the BBC Wildlife Photographer of the Year competition have all been displayed on light-boxes. The displays look great; galleries have to work far harder to get their lighting right to display prints that look as good.

http://www.nhm.ac.uk/visit-us/whats-...hibitions/wpy/

Prints need the high resolution and digital displays don't, so if you want to work at lower resolutions, then by displaying digitally your work can still look great.

But a lot of people still like to hang prints on the wall. To purchase screens to cover all the display area that I use would cost a fortune. Much cheaper to buy a higher resolution camera. :-)
peripatetic
Old Mar 23, 2009, 11:18 AM   #25

I could go into this in high detail but let's just stick with this for the moment:

First of all, the fact is, you have a computer monitor right in front of you, and you look at your images mostly on that monitor, not on prints. I don't know about yours, but mine is 17" diag, 16:10, 2MP, 72dpi. And the images look fine. So that's my baseline: I need 2MP of image data to see a good image on my display, and it's going to be displayed at 72dpi.

Next, it's not even displaying true pixels; they are RGB triplets. Unlike a CRT, the colors are adjacent, not overlapping. So that means that "72dpi" is low enough at about 1-2ft viewing distance for my eyes to take care of blending the colors. Now, would it look "better" if it were a 96, 120, 240, 300 or 400dpi monitor? Sure, technically and subjectively. But what would be the quantitative improvement in perceived IQ?
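The viewing-distance point can be sanity-checked with a little trigonometry. The ~1 arcminute acuity figure below is a commonly cited rule of thumb, not something from this thread:

```python
import math

def pixel_arcmin(dpi, viewing_in):
    """Angular size of one pixel, in arcminutes, at a viewing
    distance given in inches."""
    return math.degrees(math.atan((1 / dpi) / viewing_in)) * 60

# A 72 dpi pixel at 18" subtends ~2.65 arcmin, well above a ~1 arcmin
# acuity limit, so individual pixels are in principle resolvable there;
# the eye's blending of adjacent subpixels is doing real work.
at_18 = pixel_arcmin(72, 18)
# Distance at which a 72 dpi pixel shrinks to ~1 arcmin: about 48"
limit_in = (1 / 72) / math.tan(math.radians(1 / 60))
```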

Would I pay proportionally more for a higher-resolution monitor? No! I wouldn't pay 3x as much for a monitor with 3x the dpi but the same image-properties otherwise, requiring 9x as much image-data to fill the display. If my monitor were 240dpi instead of 72, it would take roughly a 22MP image to fill the display.

Take a print made on the printer & paper of your choice at 3x the linear resolution of your monitor, display the same image on your monitor, and tell me if you can perceive a difference in IQ and how much. First and foremost. Start with that.

Now when you say that print requires higher resolution than a LCD or CRT, what exactly does this mean?

"Computer monitors don't need such high linear resolution because of their high spacial (sic) resolution; they can display 16.7m colours in each pixel. "

That's color resolution, not spatial resolution, and this is not just wrong, it's the opposite of the situation. Computer monitors do not provide high spatial resolution, relative to prints.

"Additionally their extra contrast makes them require even less resolution to get the same viewing experience. "

Contrast is clear..."to get the same viewing experience" is a subjective matter. If your point is that prints are one thing, displays another, the point is accepted. Why the "viewing experience" differs on each is the heart of the question.


"It's also no longer the case that all monitors use 72dpi. Many monitors now use 96dpi as their standard resolution and we will see that rise to 120dpi over the next few years. When viewing images on the higher resolution monitors you can certainly see the difference."

...again, how do you quantify that?
Especially in the context that larger displays generally have lower linear resolution. Has the trend been towards smaller displays with higher linear resolution, or away from them? What you're talking about are future displays that will have higher linear resolution, and might even match the linear resolution of smaller, older displays. But clearly we are talking about new 3-5MP displays that are not even beginning to keep pace with the MP race in cameras.
Not to mention the output capabilities of the PCs driving them: 1080p isn't going to provide 3MP for these monitors. You're going to need 2-4 1080p inputs into these 3- and 5-MP monitors. And that's still 8-bit/channel color.

How does it possibly even make sense that I need 300dpi for a print when I have all this display hardware that's not even close to that, struggling to achieve a fraction of this? If you can't tell me this *quantitatively* all of the subjective excuses are clearly tainted with simple profit. Show me a market where people are willing to pay 9x as much for 3x the linear resolution. If I can fill the display on my laptop with the output of a 3MP camera, what do I need 24MP for? Just for printing? Even ignoring the proportion of print that I will produce vs images that I will view on my display, that's almost 9x the image data, almost a factor of 3 increase in linear resolution, yes? Assuming all else equal, am I going to get a substantially-better image on my *display* if I use a 3MP p&s vs a 24MP A900 or D3X? Yes or no?
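The linear-versus-areal scaling running through this argument is worth making explicit, since pixel count grows with the square of linear resolution. A minimal sketch using the 3MP and 24MP figures above:

```python
import math

def linear_factor(mp_lo, mp_hi):
    """Linear-resolution ratio implied by a megapixel ratio
    (same aspect ratio assumed for both)."""
    return math.sqrt(mp_hi / mp_lo)

# 3 MP -> 24 MP is 8x the image data but only ~2.83x the linear detail:
lin = linear_factor(3, 24)
# Conversely, tripling linear dpi multiplies the data by 9:
data_factor = 3 ** 2   # 9
```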

And if not, why does this suddenly become a factor when I print?

That in and of itself is my question. Sure, the 24MP of image-data has the potential for much higher IQ IN A PRINT. Try the test above: hold such a print up next to the same image on your computer monitor and tell me just how much "better" it looks.

The fact is that you can throw in all the complicated analysis that you want; if you cannot tie it to subjective perception, it's nothing better than meaningless abstract analysis. And that, my friend, is the basis for large profits, and large losses. When people start to trade items based on subjective concerns, a lot of money is made and lost in such trades, and the digital-camera market is exploiting the subjective perception that you need high MP to get high IQ. That's simply not the case. It is absolutely not true. No matter what, there should be some optimum resolution in terms of price/performance (not just "performance") for a given viewing-distance, and the monitor that you are reading this on is telling you what that is.

You can't explain away the difference between displays and prints through contrast, brightness & resolution. Any goob can crank up the contrast, print on reflective paper & put a light on their prints. If that 300dpi print at 8x10" looks *that* much better than the same image at 16x10" on my 72dpi, 2MP display, there are serious issues with the display. But how would you really understand how serious they are unless you do a *lot* of printing? And can you quantitatively tell me that it is really worth the loss in SNR, not to mention the extra storage? Now if you *don't* do a lot of printing, which I can safely say most people don't, it's *not* going to be worth that.

You can spend an infinite amount of time and money pursuing "perfect IQ". I think that for most shots taken by most shooters the optimum was passed around 3-5MP, and I can argue that from the sheer fact that the overwhelming majority of PC displays are 1-2MP, that we don't talk about "2160p", and that a 1MP image will take up 50% of a 2MP display (and if your display is 4:3 or 3:2, that's fine). When people begin to throw their 2MP displays out the window, I will believe that we need cameras with maybe 5MP. The only people who need cameras with 8MP & up are people who literally *need* that resolution (and don't have time to get it by shooting panos).

And the main reason they need it, by far, is for printing.

Thus the printing-process is the problem.

Paradoxically, we have mfgs producing cameras with higher & higher MP and selling them to people who are doing less & less printing. You can't keep increasing both MP and ISO forever.
touristguy87
Old Mar 23, 2009, 1:28 PM   #26

...second, I think that if you really want pixels and you want the highest IQ, you can always shoot film and either scan it with a film scanner or enlarge directly.

For "the best IQ", digital will always come in second, for at least three reasons: Bayer-blur, tone and DR. So that argument is self-defeating. The question becomes "what is the best digital output that you can get?", and that question is not relevant to most shooters.

In any case they would still get better results through good shot-selection & good technique than by shooting at higher MP. And there's clearly a point where higher MP results in lower IQ.

Last but not least, when your computer is chock-full of high-MP images to the point where you can't even put new shots on it, obviously high-MP is not contributing to IQ.
touristguy87
Old Mar 23, 2009, 4:37 PM   #27

case in point:

http://www.imaging-resource.com/NEWS/1237833671.html

$50 for a standalone 1080p image-player.

How long before people seriously begin to wonder why they need all those large image-files?


touristguy87
Old Mar 23, 2009, 4:53 PM   #28

"The print drivers do fill the gaps. Epson for example suggest that for resolutions down to 180ppi you should not upsample an image in photoshop, but let their drivers do the interpolation or "gap filling". Below 180 dpi however they cannot interpolate well enough."

I think you're missing my entire point.

You should be able to print at will without worrying about whether the printer "will" do "gap-filling", because the printer/driver should do it anyway. If that happened, you wouldn't need to worry about high-res sources to get decent output. You should at least be able to match the IQ of what you see on your display with the same dpi in a print.
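The "gap-filling" being asked for here is, at its simplest, interpolation between existing samples. A minimal 1-D sketch (illustrative only; real driver resampling is more sophisticated than this):

```python
def upsample_linear(row, factor):
    """Linearly interpolate between neighboring samples; the output
    adds no new image information, it only fills in the gaps."""
    out = []
    for i in range(len(row) - 1):
        a, b = row[i], row[i + 1]
        for k in range(factor):
            out.append(a + (b - a) * k / factor)
    out.append(row[-1])
    return out

# Two samples become six, with the gaps filled evenly:
upsample_linear([0, 10], 5)   # [0.0, 2.0, 4.0, 6.0, 8.0, 10]
```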
touristguy87
Old Mar 23, 2009, 8:45 PM   #29

"A lot of people don't. I do. So do a lot of others. People who really don't care about prints right now generally don't own "proper" cameras, they just use their camera phones."

That's nonsense.

touristguy87
Old Mar 23, 2009, 9:23 PM   #30

this is silly...

http://theonlinephotographer.typepad...be-enough.html

"The gulf is huge. Most viewers consider an 8 x 10-inch print to be reasonably sharp when it conveys 3–5 line pair per millimeter of fine detail. But, a print won't look perfectly sharp until it conveys around 30 lp/mm. That is, if you put a 15 lp/mm print next to a 30 lp/mm print, a high percentage of viewers will select the 30 lp/mm print as being sharper, although most of them won't be able to tell you why they did. But, if you put a 30 lp/mm print next to a 60 lp/mm print, they won't be able to see any difference."

Sure, all else being the same (yet still optimized for the higher-resolution print), a print at a higher resolution will appear "sharper" than a print at a lower resolution. The problem is: why would anyone ask this question?

Does the author think that if he takes picture A printed at low res and places it next to picture B printed at high res, most people will select picture B? Otherwise, what is the point of this question? He is inflating the issue. Sure, print quality matters, but there are other ways to provide it than by throwing a bunch of pixels at it. The question should be "what resolution is high enough to assuage the resolution-based concerns of most viewers?". Likewise, that is a function of the print *process* as well as of the image itself. Arbitrary abstract numbers mean nothing; all is context-dependent.

"Do we actually need 400 megapixels? I honestly don't know; I haven't run the experiments."

Come on: if you have any hesitation about the answer, then clearly you don't need 400MP. If you don't know whether you need it, how could the answer logically be yes? The same goes for any fraction of 400MP: if you don't know that you need it, then logically the answer is "no".

That is the crux of the problem here: how do camera and printer mfgs convince the public to buy high-MP products when the buyers don't know whether they need them?

The answer is to convince them that they cannot generate high-quality images if they don't have high-resolution cameras & printers. And that if they MIGHT ever want a high-resolution image, they NEED to shoot with a high-resolution camera ***all the time***, or at least most of the time. In any case they should buy gear that is *capable* of high-resolution output even if most of the time they don't need high-resolution shots; otherwise the only people who would buy such gear would be the ones who know that they need it. It doesn't hurt to make the gear incapable of producing high-quality output at low resolutions, and of course at low prices. It also helps if the consumer is illogical (not to mention ignorant) and doesn't understand the situation presented to them by the manufacturers. It helps *very* much if you fill the Internet with nonsense about 400MP images and then sell the consumer 10-24MP gear.

It also helps very much if you fill the Internet with disinformation like "you can't generate image-data through interpolation", even when you don't need more image-data to make a high-quality print at a larger format using these high-resolution printers. But sooner or later everyone who prints images off a PC is going to realize two things: first, that no print is perfect; second, that the image looks pretty good on their display, and their display is far lower resolution than their "high-quality" print. And the question is naturally going to follow: "if 2MP is enough image-data to produce a good-looking image on my display, why isn't it enough to produce a good-looking print at the same size as my display? And if I *could* generate a good-looking 2MP print that's the same size as my display, why the heck would I need a 12MP camera?"

Especially when they can't even print an image that's as large as their display.

Not to mention when they begin to read about the effect of pixel pitch on noise. And begin to shoot raw, in order to maximize IQ. It seems clear that the more that they learn about digital imaging, the more this question is going to nag at them. And eventually they will simply stop printing altogether. Especially when their customers start to ask them for image-files and not for prints.

I mean, what else could be the driving force behind craw and sraw except this? Even Canon had to admit that quite often the people who buy their DSLRs don't want such high-resolution images, even if they want to shoot raw. And if I'm happy with 6MP raw files from a 20MP fullframe, I'd be *ecstatic* with 6MP raw files from a 6MP fullframe.

And face it: it's inherently nonsensical to make & sell a DSLR with resolution comparable to a p&s that costs far less and is a fraction of the size. My G9 has the same resolution as a D700. Is the D700 really worth 6x as much without a lens, 7x as much with an IS superzoom to match the range & speed of the G9? It would only be worth it for shots that the D700 can take -and take well- that the G9 cannot take with suitable quality (and this ignores the question of whether those shots are worth taking). And the A900, with a mere 40% improvement in linear resolving-power over the D700, is even *more* expensive and even more noise-limited than the D700.

This is why I predict that the future of the digital market will be in *lower* resolutions, not *higher* ones. Sooner or later some mfg is going to come clean and say "you know what? I can give you better overall IQ, IQ that you can really make use of, with a low-resolution camera than you can get with a high-resolution camera; not only that, but it'll cost you a lot less money and reduce your storage requirements significantly".
touristguy87
 