If I'm remembering correctly from high school, ISO was traditionally the sensitivity of the film itself. In the case of a sensor (say a CMOS sensor), how is ISO actually set? Is it just another way of changing shutter speed (at which point, why do we even bother with ISO? Wouldn't one just use the exposure to control light input, and thus noise?).
508312
Funnily enough, adding this effect to the rendered image makes it more realistic.
caelinsutch
Slight digital grain is actually something that gets added a lot during the editing process. I used to work in a few photo/film studios, and when you have a perfectly lit image it can look "artificial" or rendered, especially in a studio environment. For CMOS sensors, ISO changes the gain of the actual photodetectors: it increases the sensitivity to light (and, along with it, the sensitivity to noise). This happens at the hardware level, before the signal is even converted into the digital signal used to create the image.
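For anyone who wants to play with this, here's a minimal numpy sketch of that pipeline (all the noise parameters and bit depths below are made-up illustrative values, not any real sensor's specs): photons arrive with shot noise, the electronics add read noise, the ISO gain amplifies everything, and only then does the ADC quantize.

```python
import numpy as np

rng = np.random.default_rng(0)

def capture(photons_per_pixel, iso_gain, read_noise_e=2.0, adc_bits=12):
    """Toy model of a CMOS exposure: shot noise -> analog gain -> ADC.

    photons_per_pixel: mean photon count per pixel (assumed scene brightness)
    iso_gain: analog amplification applied before digitization
    read_noise_e: sensor read noise in electrons (assumed value)
    """
    shape = (4, 4)
    # Photon arrival is Poisson-distributed (shot noise).
    electrons = rng.poisson(photons_per_pixel, shape).astype(float)
    # The sensor electronics add read noise.
    electrons += rng.normal(0.0, read_noise_e, shape)
    # ISO sets the analog gain applied before the ADC...
    amplified = electrons * iso_gain
    # ...and only then is the signal quantized to digital values.
    max_dn = 2**adc_bits - 1
    return np.clip(np.round(amplified), 0, max_dn)

# Same scene brightness, higher gain: brighter output, but amplified noise too.
print(capture(photons_per_pixel=50, iso_gain=1.0))
print(capture(photons_per_pixel=50, iso_gain=8.0))
```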
el-refai
It is interesting how increasing ISO also increases the noise. It makes some intuitive sense: by increasing the gain, we're elevating the signal of everything, including the background noise of our sensor. A really interesting thing you can do is point a camera at complete darkness and increase the ISO; you'll start to see the noise get really accentuated as little red dots.
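That dark-frame experiment is easy to simulate (a sketch with assumed noise figures, not calibrated to any real camera): with zero photons arriving, everything left is dark current and read noise, and the gain scales that whole noise floor.

```python
import numpy as np

rng = np.random.default_rng(1)

def dark_frame(iso_gain, read_noise_e=2.0, dark_current_e=1.0, shape=(1000,)):
    # Zero photons: only dark current (Poisson) and read noise (Gaussian).
    electrons = rng.poisson(dark_current_e, shape).astype(float)
    electrons += rng.normal(0.0, read_noise_e, shape)
    return electrons * iso_gain  # gain amplifies the noise floor directly

for gain in (1, 4, 16):
    frame = dark_frame(gain)
    print(f"gain {gain:2d}: std of 'black' frame = {frame.std():.1f}")
```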
omijimo
If ISO is a purely digital process, would it be possible to reduce the noise with software or AI?
yangbright-2001
I once tried adjusting the ISO levels on my camera. A higher ISO seems to make the picture look brighter (at least to my eye), but there also seems to be more noise in the photo, so these slides are consistent with my experience.
GarciaEricS
I think the best way to understand the phenomenon is through the signal-to-noise ratio. To take photos at higher and higher gain, the photographers shot darker and darker environments. As such, the amount of light they were trying to measure (the signal) decreased while the amount of noise stayed about the same, so the signal-to-noise ratio decreased. Even after we reconstruct the image so that the overall brightness is about the same, we don't get a good output because the signal-to-noise ratio was too low.
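To put rough numbers on this (hypothetical photon counts, assuming a shot-noise-limited sensor): with Poisson photon arrivals the noise is the square root of the signal, so SNR = sqrt(N), and applying gain afterwards multiplies signal and noise alike.

```python
import numpy as np

# Shot-noise-limited SNR: signal N photons, noise sqrt(N), so SNR = sqrt(N).
for photons in (10_000, 100):
    snr = photons / np.sqrt(photons)
    print(f"{photons:6d} photons -> SNR = {snr:.0f}")

# 10000 photons -> SNR = 100  (bright scene, base ISO)
#   100 photons -> SNR = 10   (dark scene; 100x gain restores brightness,
#                              but gain can't restore the lost SNR)
```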
saif-m17
@GarciaEricS I really liked your way of thinking about way noise levels were more apparently with higher gain. I was initially unsure of why that was the case and that explanation helped clear things up.
jonnypei
Does modern software automatically compute the "best" ISO gain for an image? Or is it fixed for a given camera based on its configuration/specs?
Liaminamerica2
ISO is a complex topic because it can be accomplished in various ways with separate tools: it can be simulated in software or realized in hardware as gain applied to the sensor signal. Some good functions for creating the noise are Perlin noise and Poisson noise. Poisson noise is especially effective when simulating gain because the number of photons hitting the sensor follows a Poisson distribution.
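A quick sketch of that last point (the photon scale here is an arbitrary assumption): draw one Poisson sample per pixel with the mean set by the incoming light, and the darker the capture, the relatively noisier the result.

```python
import numpy as np

rng = np.random.default_rng(2)

def add_shot_noise(image, photons_at_white=50):
    """Simulate photon shot noise on an image with values in [0, 1].

    photons_at_white: expected photon count for a full-brightness pixel;
    lower values mean a darker capture and proportionally more noise.
    """
    expected = image * photons_at_white          # mean photons per pixel
    counts = rng.poisson(expected)               # Poisson-distributed arrivals
    return counts / photons_at_white             # normalize back to [0, 1]

gray = np.full((4, 4), 0.5)                         # flat mid-gray test image
print(add_shot_noise(gray, photons_at_white=1000))  # bright: low noise
print(add_shot_noise(gray, photons_at_white=10))    # dark: heavy noise
```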