Lecture 25: Image Sensors (87)
CardiacMangoes

Since it seems like increasing the exposure time by 30x would have the same effect on SNR as taking 30 frames at the original exposure and averaging them, will the resulting image be as sharp? Would there be any practical difference between the two images?
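As a quick sanity check (assuming pure shot noise, i.e., Poisson photon arrivals and no read noise), the SNR does come out the same either way:

```latex
% One exposure of length 30T collects 30S photons on average:
\mathrm{SNR}_{\mathrm{long}} = \frac{30S}{\sqrt{30S}} = \sqrt{30S}
% Averaging 30 frames of length T: signal S, per-frame noise \sqrt{S},
% and averaging 30 independent frames divides the noise by \sqrt{30}:
\mathrm{SNR}_{\mathrm{avg}} = \frac{S}{\sqrt{S}/\sqrt{30}} = \sqrt{30S}
```

Under shot noise alone the two match, so any practical difference has to come from other noise sources or from motion.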

Zc0in

I think they would differ: a different exposure time collects a different number of photons per frame. The noise may be similar after averaging, but the images themselves may differ (for example, a single long exposure accumulates motion blur that a stack of short frames does not).
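A small simulation, assuming Poisson shot noise plus Gaussian read noise (the photon count and read-noise sigma below are made-up illustrative numbers), shows one concrete difference: the single long exposure pays the read-noise penalty once, while the 30-frame average pays it 30 times.

```python
import numpy as np

rng = np.random.default_rng(0)

signal = 100.0      # mean photons per short frame (illustrative)
read_sigma = 5.0    # read noise in electrons per readout (illustrative)
n_frames = 30
trials = 100_000

# Option A: one exposure 30x as long -- 30x the photons, one readout.
long_exp = (rng.poisson(n_frames * signal, trials)
            + rng.normal(0.0, read_sigma, trials))

# Option B: 30 short exposures averaged -- one readout per frame.
short = (rng.poisson(signal, (trials, n_frames))
         + rng.normal(0.0, read_sigma, (trials, n_frames)))
avg = short.mean(axis=1)

# SNR = mean / std (scale-invariant, so no need to match brightness).
print(f"long exposure SNR:    {long_exp.mean() / long_exp.std():.1f}")
print(f"30-frame average SNR: {avg.mean() / avg.std():.1f}")
```

On these numbers the long exposure wins by a few percent; in practice the burst wins whenever hand shake or scene motion would smear a 30x-longer exposure.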

joeyzhao123

Is this actually what is happening when we take images with our phones? I've always noticed that at night when I take a photo, my phone sort of freezes for a bit before the image appears in my album.

sha-moose

@joeyzhao123 this is basically what is happening. The camera takes a burst of images (the number of frames and the exposure time depend on how much you're moving your phone). It then aligns the images using ML and does a fancy averaging (plus tone mapping and auto white balance). For more information, you can check out the Google Night Sight paper: http://graphics.stanford.edu/papers/night-sight-sigasia19/night-sight-sigasia19.pdf
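Night Sight's actual align-and-merge is tile-based and far more robust, but a toy sketch of the idea (global integer-pixel alignment by phase correlation, then a plain mean; `frames` here is an assumed stack of grayscale burst images) could look like this:

```python
import numpy as np

def align_and_merge(frames):
    """Toy burst merge: align each frame to the first by integer-pixel
    translation (phase correlation), then average. Real pipelines use
    tile-based alignment and robust merging instead of a plain mean."""
    ref = frames[0].astype(np.float64)
    ref_fft = np.fft.fft2(ref)
    merged = ref.copy()
    for frame in frames[1:]:
        f = frame.astype(np.float64)
        # Phase correlation: the peak of the inverse FFT of the
        # normalized cross-power spectrum gives the shift that
        # maps this frame back onto the reference.
        cross = ref_fft * np.conj(np.fft.fft2(f))
        corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        # Undo the shift and accumulate.
        merged += np.roll(f, (dy, dx), axis=(0, 1))
    return merged / len(frames)
```

Tone mapping and auto white balance would then run on the merged result.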

rheask8246

This image shows exactly how noise reduction works with image averaging. The same technique is used in astrophotography. To reduce the noise in "empty" parts of the sky and better pinpoint the visible extent of celestial objects, we average several shorter exposures together. A median stack usually works better than a mean, since it rejects outliers such as hot pixels and satellite trails. This also lets us "see farther" into space, as fainter objects emerge once the noise is reduced.
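A tiny example of why the median helps, with made-up numbers and a fake "satellite trail" corrupting one sub-exposure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Ten noisy sub-exposures of the same 64x64 sky patch (illustrative).
sky = 10.0  # true background level in photons
stack = rng.poisson(sky, (10, 64, 64)).astype(np.float64)

# Corrupt one frame with a bright "satellite trail" across a row.
stack[3, 32, :] += 500.0

mean_stack = stack.mean(axis=0)
median_stack = np.median(stack, axis=0)

# The mean drags the trail into the result; the median rejects it.
print("mean along trail row:  ", mean_stack[32].mean())    # ~60, biased
print("median along trail row:", median_stack[32].mean())  # ~10, clean
```

The tradeoff is that on clean data a median stack is slightly noisier than a mean stack, which is why some stacking pipelines use a sigma-clipped mean instead.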
