Here’s my understanding of how ISO settings work in a digital camera. I am not an engineer, but I have done some work with analogue-to-digital conversions.
Changing the ISO setting does not change the 'sensitivity' of the CCD (or CMOS, in the case of Canon SLRs) sensor. The sensor captures light as a low-voltage analogue signal, which is then converted to a digital signal by the camera's analogue-to-digital (A/D) converter. At ISO 100 the signal is likely not amplified to any significant degree, as this is often the camera's base sensor level. With each increase in the ISO setting the sensitivity of the sensor still does not change; instead, the analogue signal is amplified (boosted before A/D conversion).

As with any analogue signal, any 'noise' in the signal is amplified right along with it. The source of the original noise could be anything from sensor defects to heat to electrical interference. Noise that is virtually 'invisible' at low amplification becomes quite visible at higher amplification. It's not that the noise was absent from the original signal; it has simply been multiplied by the amplification process. In other words, higher ISO settings show more noise because the original analogue signal, noise included, has been amplified to a higher degree.
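To make the idea concrete, here's a toy simulation (not any camera's actual pipeline — the light level, noise level, and gain values are all made up for illustration): a fixed amount of random noise rides on the analogue reading, and raising the gain multiplies signal and noise together.

```python
import random

random.seed(42)

def read_sensor(light_level, noise_level=0.02):
    """Simulated analogue read: the true signal plus random sensor noise."""
    return light_level + random.gauss(0, noise_level)

def amplify(sample, gain):
    """Pre-A/D amplifier: multiplies everything, noise included."""
    return sample * gain

# The same dim scene read at base gain (think ISO 100) and at 8x gain (ISO 800).
samples_low  = [amplify(read_sensor(0.1), gain=1) for _ in range(10000)]
samples_high = [amplify(read_sensor(0.1), gain=8) for _ in range(10000)]

def spread(xs):
    """Standard deviation: how much the readings wobble around the true value."""
    mean = sum(xs) / len(xs)
    return (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5

print(spread(samples_low))   # the noise at base gain
print(spread(samples_high))  # roughly 8x larger: the noise was amplified too
```

The noise spread at 8x gain comes out about eight times larger, even though the underlying sensor noise never changed — exactly the effect described above.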
And yes, the camera manufacturers do try to 'match' the ISO levels of digital cameras to the various film ISO ratings. That is, the amount of signal amplification applied for a given ISO setting in a digital camera should roughly equate to the change in light sensitivity you would get from film at the same ISO.
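If that matching holds, the mapping from ISO to amplifier gain is just a ratio against the base ISO — each doubling of ISO is one stop, i.e. a doubling of gain, mirroring film. A sketch (assuming a base ISO of 100, which varies by camera):

```python
def iso_to_gain(iso, base_iso=100):
    """Hypothetical gain applied before A/D conversion, relative to base ISO."""
    return iso / base_iso

# Each step up the usual ISO ladder doubles the amplification (one stop).
for iso in (100, 200, 400, 800, 1600):
    print(f"ISO {iso:5d} -> gain {iso_to_gain(iso):g}x")
```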
The good news is that higher-end sensors and A/D converters are producing better signal-to-noise ratios (i.e. more signal per unit of noise, a good thing), improvements that will eventually work their way down into mass-market cameras.
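This also explains why amplification alone can't rescue image quality: gain scales signal and noise equally, so the signal-to-noise ratio (conventionally expressed in decibels as 20·log10(signal/noise)) is unchanged. Only a quieter sensor actually improves it. A quick check with made-up signal and noise levels:

```python
import math

def snr_db(signal, noise):
    """Signal-to-noise ratio in decibels."""
    return 20 * math.log10(signal / noise)

# Amplifying by 8x scales both terms, so the SNR stays the same (~34 dB here).
print(snr_db(0.1, 0.002))
print(snr_db(0.1 * 8, 0.002 * 8))

# Halving the sensor's noise floor, by contrast, genuinely raises the SNR.
print(snr_db(0.1, 0.001))
```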
Hopefully this sheds some light on the picture...