Steve's Digicams Forums

Steve's Digicams Forums (https://forums.steves-digicams.com/)
-   Memory Cards, Microdrives, Card Readers (https://forums.steves-digicams.com/memory-cards-microdrives-card-readers-51/)
-   -   Does memory card brand make a difference in picture quality (https://forums.steves-digicams.com/memory-cards-microdrives-card-readers-51/does-memory-card-brand-make-difference-picture-quality-18661/)

hst Jan 5, 2004 10:51 PM

Does memory card brand make a difference in picture quality
 
I have two different brands of 256MB CF memory cards. One seems to record better images with less noise. Is that possible, or is it just coincidence? I don't see how it would matter, but it does seem to. Both cards have been formatted by the same camera.

eric s Jan 6, 2004 11:41 AM

No, the quality of the card should not affect picture quality.

But it will affect write times, which could mean the difference between getting the picture and not.

Eric

csd Jan 6, 2004 3:40 PM

The card brand/quality will not cause image noise. If the data is getting corrupted on the card - the image wouldn't be readable by your camera or PC.

voxmagna Jan 6, 2004 4:41 PM

Curiously, I read somewhere that if you've got a cam like mine, where they put the battery pack right next to the media, then the media can get quite hot. But data is data: assuming an error-free write, I can see no reason why what's read back would differ from what was written.

These cards all have CPUs in them, so unless the camera is picking up clock-noise interference via the CCD sensor circuits and modifying the data before it gets stored, or the cards present a high pulsed load and cause power-line interference in the camera - I'm afraid that's my only guess. Put a 'scope across the battery terminals, compare writes for 2 cards, and see what's lurking about. Perhaps cheap cards miss out a bit of filtering and decoupling! VOX

csd Jan 7, 2004 8:39 AM

Vox,
Huh?
First off, there is no CPU in a CF card. It is simply nonvolatile memory.
Second, I find it hard to believe heat from a battery could cause any sort of problem. Most CF cards have a VERY large operating temperature range - and vendors typically run environmental tests within these ranges. CF cards are used in equipment which gets MUCH MUCH hotter than a digicam. The ranges vary greatly (in fact, for industrial CF, used in products such as data-networking equipment, the range is -40 C to +85 C) - you can check a manufacturer's website for their range. I've seen fan failures in commercial data-comm equipment to the point where FPGAs and CPUs burn out; then you pull out the CF card and find it still works somewhere else.
Also, I find it hard to believe if you had power-line interference it would cause a series of bit-flips only enough to see some noise on the image - my guess is you wouldn't get a valid write to the card (and therefore - would be unable to read the image back).
Lastly, by the time the data is moved down to the CF card, A-D conversion has already taken place - so power-line interference wouldn't cause an analog signal to be misread and a bit misinterpreted.
I find it most difficult to believe this problem is the media.

csd Jan 7, 2004 9:09 AM

I just took a quick read through the CF standards. The standard includes a 32-bit polynomial Error Correcting Code capable of correcting multiple random bit errors, furthering my belief that this can't be the media.
hst - Are you sure there isn't another explanation for what you are seeing? Are you sure you are using the same level (or lack of) JPEG compression when using both cards? Are you shooting under similar conditions with all the same user settings on your camera when using both cards?
If you still think it may be the cards - the only other thing I would think to check is whether the manufacturers post their bit-error rates on their sites (or query the manufacturers for their specifications). What are the manufacturers and part numbers/names?
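To illustrate the correction principle (not CF's actual 32-bit polynomial code), here is a toy sketch of a Hamming(7,4) code, the classic textbook ECC: the parity bits pinpoint any single flipped bit in a 7-bit codeword, so the stored data survives intact. Function names are mine, for illustration only.

```python
# Toy Hamming(7,4) code, just to show how an ECC corrects errors.
# CF's actual scheme is a stronger 32-bit polynomial code, but the
# principle is the same: redundant parity bits locate and repair
# a flipped bit.

def hamming74_encode(d):
    # d: four data bits; codeword layout is p1 p2 d0 p3 d1 d2 d3
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming74_decode(c):
    # Recompute the three parity checks; the syndrome is the
    # 1-based position of the flipped bit (0 means no error).
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    c = c[:]
    if syndrome:
        c[syndrome - 1] ^= 1          # repair the flipped bit
    return [c[2], c[4], c[5], c[6]]   # extract the data bits

code = hamming74_encode([1, 0, 1, 1])
code[3] ^= 1                          # simulate one storage bit error
print(hamming74_decode(code))         # → [1, 0, 1, 1]
```

Any single bit error, wherever it lands in the codeword, decodes back to the original four data bits.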

voxmagna Jan 7, 2004 5:53 PM

.......First off, there is no CPU in a CF card. It is simply nonvolatile memory............

csd - sorry, but you need to do a bit of reading! Now SmartMedia IS just non-volatile memory, which is why it's so compatible in all respects and has no speed-compatibility issues.

CompactFlash, on the other hand, is nearly as complex processing-wise as your hard drive - or did you think that was just a bunch of media-coated platters! The CF host controller even has embedded software, and some processor has to run it and execute instructions, so two cards from different manufacturers might have performance differences whilst still adhering to the CF spec. I'd expect the pinouts to be the same, though!

http://www.analog.com/library/analog...act_flash.html

It's hard to think of bit errors as responsible for a JPEG quality problem unless we're talking of failure to store data in the middle of the JPEG file data block. I've seen this happen with a correct file CRC and black lumps missing from the pic. Encoding errors are more likely coming off the camera before hitting storage. If you are really keen, try shooting two identical pics of something artificial and simple in black & white using manual camera settings. Convert both pics to bitmaps. If you are lucky, the bitmap data may align reasonably well, so you can look up the Windows bitmap file spec and peek around bytes near the same obvious picture transitions with an editor. If you see vast differences in the data-level values between two different cards storing the same image data, I'd be most surprised.
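A rough sketch of the comparison vox describes: once both shots are converted to raw bitmaps, the pixel data can be diffed byte by byte. The buffers below are hypothetical stand-ins for the pixel sections of the two converted files, just to show the mechanics.

```python
# Sketch of the byte-level comparison suggested above. The two
# buffers are hypothetical stand-ins for the raw pixel data of the
# same scene saved via two different cards; in practice you'd load
# the pixel sections of the two converted bitmap files.

def compare_buffers(a, b):
    diffs = [(i, x, y) for i, (x, y) in enumerate(zip(a, b)) if x != y]
    print(f"{len(diffs)} of {min(len(a), len(b))} bytes differ")
    for i, x, y in diffs[:10]:   # show at most the first ten
        print(f"  offset {i}: {x} vs {y}")
    return diffs

card_a = bytes([10, 20, 30, 40, 50])   # hypothetical pixels, card A
card_b = bytes([10, 21, 30, 40, 53])   # same scene, card B
compare_buffers(card_a, card_b)        # offsets 1 and 4 differ
```

If the cards really were adding noise, you'd expect many small value differences scattered across the buffer rather than identical data.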

csd Jan 7, 2004 6:27 PM

The flash controller and buffering do not have embedded software. These are not considered CPUs by any means.
Processing is external to the card.

Two CompactFlash cards definitely could have performance differences - what we were talking about was whether or not they could create image noise. The ECC algorithm employed is standard, and all A-D is done before reaching the CF card - so it is very unlikely they would make a difference.

csd Jan 7, 2004 8:22 PM

vox - let me add that I wasn't trying to flame you - although, re-reading my original post, it did sound that way. I was just sceptical of your suggestions. Sorry if I came off sounding harsh.

And although I don't necessarily feel this is the place to debate the difference between transistor-logic circuits and embedded software running on CPUs - I always seem to have trouble letting terminology snafus slide. I guess I'm anal in that regard. (As I said - I wouldn't consider an ECC chip or buffering to be CPUs - these are transistor-logic circuits, not embedded software - there are no instructions and no processors for them to run on :) - whoops - sorry, I'm doing it again.)

Okay - so let's put our BSEE/MSEE/PhD degrees aside and get back to photography.....

hst Jan 7, 2004 9:11 PM

Thanks for the replies
 
I tried shooting both RAW and in several JPG modes. The card that appeared to have the best picture was a SimpleTech. The other card I won't mention by name, because I don't want to be critical when it may just have been me. The other card seemed to write files faster and I could shoot faster sequences. It actually appeared to be a better-performing card. But when I went to put the images on the screen, they just didn't look as good. I no longer have the one that didn't appear as clear, or I would do some testing with identical situations, which I didn't do. I'm getting ready to buy another card and just wanted to know if the brand made a difference in image quality. Thanks for all the input!

voxmagna Jan 8, 2004 4:02 AM

csd - ok, point taken. It's often difficult to strike a balance between the technical accuracy we might know from our specialisations and getting something across that may lack technical precision but is more easily understood by photographers. There is a misconception that CompactFlash memory is no more than 'static RAM' in a square package, so that was the point I tried to get across.

Anyway, this is a curious post because non technical photographers are told that compact flash is electronic film. And if image quality for any reason can be shown to be different (which I'm sceptical about), then it's an important matter for all digital photographers.

.....The other card seemed to write files faster and I could shoot faster sequences. It actually appeared to be a better performing card.....

Yes, I think we can all agree that card speed, and possibly speed/compatibility/data-error issues, would fit this comment exactly. But there's no rule that says the more you pay, the faster the card will perform in a particular camera.

......But when I went to put them on the screen, the images just didn't look as good.....

Hst, if you see the same problem again and can post links here to two image files, then there are plenty of helpful members on this Forum who will comment. Regards VOX

UrbanPhotos Jan 13, 2004 11:09 PM

Pictures aren't any better or any worse on different brands and types of media, for the same reason that MP3s sound the same regardless of what brand of hard drive they're stored on. Digital data is either stored accurately or it isn't. If the card does something wrong, the result is a totally ruined image, or one with garbled or miscolored sections. Any errors in storing the file will come across as obvious defects in the image, not subtle variations in quality.

JimC Jan 15, 2004 1:28 PM

I have seen image problems reported before with some brands of SD in some cameras.

Whether or not this has to do with bit error rates is speculation. It could even be a slight difference in the power draw from different cards -- if the voltage regulating circuitry in the camera did not compensate well enough for the card being used, and it impacted levels to the CCD or supporting chip set.

voxmagna Jan 15, 2004 6:14 PM

That's precisely what I was getting at earlier! The problem may originate outside the card's physical environment, at the time of saving.

I know my camera has enough buffer memory to allow me to take fresh shots whilst the card is still being written to. The problem with digital thinking and error-rate analysis is that you can miss simple analogue power interference, power-loading impedance or EMC issues - but then my background includes RF. VOX

sjms Jan 15, 2004 6:29 PM

Quote:

The flash controller and buffering do not have embedded software. These are not considered CPU's by any means.
Processing is external to the card
I think you'd better rethink that statement.

As an example, the WA (Write Acceleration) programming in my Lexar Pro needed to be revised due to an error in its functionality with certain Nikon cameras. Lexar picked up the cards (2x 1GB 36x Pro cards) that I had, reflashed the firmware in the cards, and had them back to me in 2 days. Same S/Ns. So the cards can be a bit more sophisticated than one is led to believe today.

WA is in 2 parts: part 1 in the camera, part 2 in the card, and it is an active function in the card with the appropriate camera/software interfacing with it.

No, it is not a high-level CPU function, but still, their controllers (which they design) are capable of more than most cards' and do have embedded, flashable firmware.

csd Jan 16, 2004 9:19 AM

Quote:

Originally Posted by sjms

I think you'd better rethink that statement.

As an example, the WA (Write Acceleration) programming in my Lexar Pro needed to be revised due to an error in its functionality with certain Nikon cameras. Lexar picked up the cards (2x 1GB 36x Pro cards) that I had, reflashed the firmware in the cards, and had them back to me in 2 days. Same S/Ns. So the cards can be a bit more sophisticated than one is led to believe today.

Okay - I'm rethinking as you requested... :lol:
Okay - here goes -
I'm not familiar with Lexar's manufacturing or parts, but more than likely they didn't reburn software (the parts I am familiar with are hard logic, not embedded) - they just swapped the controller or ECC chip for a newer version.

sjms Jan 16, 2004 9:48 AM

I don't think so. You're trying to say they pulled open the CF packaging and replaced the chip. These are "throwaways" in a hardware sense; there are no hard repairs on these. It is too cost-prohibitive - it would have cost less to just give me new ones. By the way, their Pro cards are flashable. Just call them.

They had well over 1000 of these erratum cards to do. That's the advantage of flashable tech: you can sometimes "fix it in the mix", making it less costly. Ask Intel about the Pentium 4 - it's capable of it too.

csd Jan 16, 2004 11:28 AM

As I said - I'm not familiar with Lexar's parts, but don't assume it costs them less to open up your card, hook up a PROM burner, and reburn a chip than to replace it. And if they could be upgraded with a simple reader, you would assume they would have just sent you the firmware to do it yourself (similar to upgrading your digicam). It is often cheaper to remanufacture than to reburn.

You (somewhat rudely) told me to "rethink", which is what I did for you. I've been involved in the engineering and manufacturing of many parts similar to these, so I was trying to add some value based upon experience. Vox and I already had this discussion, and I believe it is outside the scope of this thread.

The real point of this thread was whether bit error rates could be great enough, and occur at such a steady frequency, that they aren't corrected by the ECC algorithm and appear at a rate that doesn't corrupt the image but adds noise to it; OR whether some other type of RF interference can cause a pattern of bit flips that appears as image noise. If you want to further discuss engineering and manufacturing methods, let's do that offline. For now, let's get back to the original question/discussion. If you have nothing else to add towards that goal, then please feel free to flame me elsewhere.

cczych Jan 16, 2004 11:39 AM

Quote:

Originally Posted by csd
The real point of this thread was whether bit error rates could be great enough, and occur at such a steady frequency, that they aren't corrected by the ECC algorithm and appear at a rate that doesn't corrupt the image but adds noise to it; OR whether some other type of RF interference can cause a pattern of bit flips that appears as image noise.

Hey - this sounds like a good topic for my master's thesis! We could simulate some bit errors, build an ECC algorithm (such as that employed by CF), and feed some images through. If we get the right pattern of errors - who knows!
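As a very rough sketch of that simulation idea, one can inject random bit errors and measure how many survive correction. The code below uses a trivial 3x repetition code with majority voting as a stand-in for CF's 32-bit polynomial ECC; the error rates are arbitrary illustration values.

```python
import random

# Rough sketch of the simulation idea above: each bit is stored
# three times, each stored copy can flip independently with
# probability `error_rate`, and a majority vote recovers the bit
# on read-back. This repetition code is a stand-in for CF's
# polynomial ECC, chosen only for simplicity.

def simulate(n_bits, error_rate, seed=1):
    rng = random.Random(seed)
    residual = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        copies = [bit ^ (rng.random() < error_rate) for _ in range(3)]
        decoded = 1 if sum(copies) >= 2 else 0   # majority vote
        residual += (decoded != bit)
    return residual / n_bits   # error rate left AFTER correction

for rate in (0.001, 0.01, 0.1):
    print(f"raw bit-error rate {rate}: residual {simulate(100_000, rate):.5f}")
```

At realistic raw error rates the residual rate after correction is tiny, which supports csd's point: errors would either be corrected away entirely or, at extreme rates, corrupt the file outright rather than add gentle noise.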

sjms Jan 16, 2004 12:10 PM

They simply plug it into a CF device and rewrite the code. No PROM burning, just a simple flash. They hold the code, though - they're not giving it out. It's licensed and marketed to camera manufacturers, so we ain't gettin' it. Things can be so easily reproduced these days, as you know.

I apologize if I seemed rude - I did not mean to "flame" you. It is just that so many assumptions are taken here (on these forums) as fact when they tend to be more conjecture. Improvements in products are made almost hourly in the chip world. Again, I apologize.

I myself tend to try to reference my answers when possible. It is part of my job to do so.

voxmagna Jan 19, 2004 3:12 AM

Just a quick departure from this interesting digital-architecture discussion. Perhaps you have picked up that a number of Fuji S7000 users in the UK reported battery-discharge problems with 'certain' cards.

Fuji issued a statement and identified the problem as due to certain cards having larger supply capacitors with higher leakage current. So the amount of decoupling on a card's supply rails may vary from card to card.

................Hey - this sounds like a good topic for my master's thesis! We could simulate some bit errors, build an ECC algorithm (such as that employed by CF), and feed some images through. If we get the right pattern of errors - who knows!................

and......... did you know that if you can read the CIS block on physical sector 0, you should be able to find out more about the masked errors? I'd like to do that on different vendors' cards, use them for a while, do an smprep (which re-calibrates the error table), and see how the numbers have changed. VOX

cczych Jan 21, 2004 10:34 AM

Quote:

Originally Posted by voxmagna


and......... did you know that if you can read the CIS block on physical sector 0, you should be able to find out more about the masked errors? I'd like to do that on different vendors' cards, use them for a while, do an smprep (which re-calibrates the error table), and see how the numbers have changed. VOX

It makes sense that they would record the corrected error rate. Definitely sounds like a statistic they wouldn't want to share with users, though. Do you know of a practical way to read this block with off-the-shelf, commercially available equipment?
Also, interesting note about capacitance variation among card vendors. There is an allowable voltage tolerance specified for the interface, so I guess it follows that cards at the higher end of the tolerance will require more juice and in turn drain more battery on each read/write. Since I tend to use my LCD display, I would assume this difference is negligible in my case, but it's interesting nevertheless.

eric s Jan 21, 2004 12:03 PM

I guess I shouldn't have stopped reading this thread. It went in some interesting directions.

The original question was about different brands. Then it went into interference and other issues. I guess it's possible that different brands could do a better job of shielding their parts, using good grounds and the like... so in that respect, different brands could make a difference. I don't believe JPG has any ECC in the format (does any picture format? Wait, thinking creatively... I guess compressed formats have some protection, because the decompression would fail. But that is really a checksum, not an ECC.)

I would think the odds of interference making its way into the bit stream and not corrupting the picture would be small. But maybe I'm way off?

Eric

cczych Jan 21, 2004 12:40 PM

I agree - I think it is very unlikely although not impossible.

Can't comment on JPG compression having an ECC and/or a checksum - I've never read up on the algorithm, and I'm not sure what other image formats have either. Thinking about it, I guess we should see if JPG has a checksum, because I have seen cases where a web browser loads only half an image (so either there is a series of checksums, one per block of data, or ???) - and I have run a packet-capture utility to verify that the TCP stream was broken before all the data was received by my PC. So if the browser could display a partial JPG, either the decompression can ignore the checksum, or the checksum isn't across the entire image. In any case, based on this, I believe a browser decodes JPG in blocks, not in full, so a checksum across the entire image can't be necessary for decoding. Time to find a good place to read up on JPEG compression....
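For what it's worth, a JPEG file is a chain of independently delimited marker segments with no file-wide checksum, which is consistent with browsers rendering truncated files. A minimal sketch (the byte string is a hand-built header fragment for illustration, not a real photograph):

```python
import struct

# A JPEG file is a chain of marker segments: 0xFF plus a marker
# byte, then a big-endian length for most markers. There is no
# file-wide checksum, so a decoder can process whatever segments
# and scan data it has received so far.

jpeg = bytes.fromhex(
    "FFD8"                            # SOI: start of image
    "FFE0" "0010"                     # APP0 marker, length 16
    "4A46494600" "0102" "00"          # "JFIF\0", version, units
    "0001" "0001" "0000"              # density, thumbnail size
    "FFDA" "000C"                     # SOS marker, length 12
    "03010002110311003F00"            # scan header payload
)

def walk_segments(data):
    pos = 2                           # skip the 2-byte SOI marker
    segments = []
    while pos + 4 <= len(data) and data[pos] == 0xFF:
        marker = data[pos + 1]
        (length,) = struct.unpack(">H", data[pos + 2:pos + 4])
        segments.append((hex(marker), length))
        if marker == 0xDA:            # SOS: entropy-coded data follows
            break
        pos += 2 + length
    return segments

print(walk_segments(jpeg))            # → [('0xe0', 16), ('0xda', 12)]
```

After the SOS segment comes the entropy-coded image data itself, which is why a stream cut off mid-transfer still yields a partially viewable picture.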

voxmagna Jan 21, 2004 1:08 PM

..............I would think the odds of interference making its way into the bit stream and not corrupting the picture would be small. But maybe I'm way off?..................

Hey all, the problem with keeping your head in the digital world is that you fool yourself into thinking that because the file checksum is OK, everything's hunky: garbage in = garbage out! That's the problem with PC thinking! Here's something I included in a post in the Fuji section earlier today. Nice of Polaroid to agree to a programme about their development of this camera; it was very enlightening and timely to see close-ups of actual circuit boards and the camera architecture explained by the development guys! James Kirk would be on home ground!


"Just by coincidence, I watched a schools programme yesterday describing how Polaroid, at their Scotland factory, designed and debugged their new integrated (large!) digital printing camera. The main problem they had during prototype trials was the appearance of black lines on saved JPEGs, caused by pickup around the CCD processing circuit. They solved the problems in the usual way, with ferrite beads, inductors and caps added around the CCD processor chips. Regards VOX"

cczych Jan 21, 2004 1:31 PM

I may be stating the obvious, but in this case the interference was caused during CCD capture. I believe it is far less likely that interference would get added after capture - as in the case of transfer to, or from, a CompactFlash card, which was the original subject of this discussion.

voxmagna Jan 22, 2004 7:35 AM

cczych....... Do you know if a CF card sits dead in the camera slot, with no power, when the camera is on? I thought that since the camera controller is talking to the card, even in a ready state it must be digitally active all the time the camera is powered? VOX

cczych Jan 22, 2004 12:48 PM

I don't know the answer to that off-hand. Good point, though - perhaps it is powered in the ready state whenever the camera is on; this would save the time to power up the device and initialize the interface every time you take a picture.

eric s Jan 22, 2004 1:27 PM

That is a very interesting question. I don't know if it's powered down. From my experience of flash-memory programming, there is no reason to keep the flash powered (and probably a benefit in keeping it cooler), but the controller logic/chips are another matter. Maybe they need power.

I might know someone who knows something about this, I'll ask.

We are an inquisitive bunch, aren't we? Just the way I like it.

Eric

voxmagna Jan 23, 2004 6:56 AM

..............We are an inquisitive bunch, aren't we? Just the way I like it

Yes, I agree. It's rather like David and Goliath. The manufacturers are often economical with the truth and prefer to keep consumers in ignorance, whilst expecting us to part with the cash.

They must also get quite fed up when bad feedback occurs and Internet hysteria sets in on new product launches. But if they realised that their marcomms methods can look decidedly ineffective compared with the credibility of user feedback, they might see that being more answerable to the end user, at a reasonable speed, could be beneficial.

I'd definitely like to see more cameras with a USB/web firmware-upgrade route. My PC, modems, software and MP3 players all have this, and whilst we accept new products are not always perfect, at least minor issues can be fixed without a return. Another idea I had was that a camera could be connected to a manufacturer's site and given a remote-controlled diagnostic function test. This would be great for those buying second-user cams or contemplating a warranty or store return. It would save money on wasted returns, I'd have thought, and on those "my camera does this - is something wrong?" posts from newbies. VOX

