It's worth noting that rolling shutter mostly occurs on CMOS sensors. CCDs (charge-coupled devices) are alternatives to CMOS sensors and are generally more sensitive and more expensive. CCD cameras often use global shutters, which take a snapshot at a single instant in time and thus do not have any motion artifacts from rolling shutter.
jeromylui
Echoing what was said in lecture, the rolling shutter artifact arises because pixels at the beginning of the readout process absorb light from an earlier point in time than pixels at the end. So even though every pixel gets the same exposure duration thanks to the staggered reset, different locations in the same image end up capturing different moments in time.
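A quick way to see this is a toy simulation (the sizes, speed, and per-row delay below are made up, not taken from any real sensor): each row samples a moving edge slightly later than the row above it, so the edge comes out slanted under a rolling shutter but stays straight under a global shutter.

```python
# Toy rolling-shutter sketch; timings are illustrative only, not how a real sensor is driven.
# A vertical edge moves right at constant speed. A global shutter samples every row at the
# same instant; a rolling shutter samples row r at time r * row_delay, so the edge lands at
# a different column in each row and appears slanted.
import numpy as np

H, W = 8, 16          # image height/width in pixels
speed = 2.0           # edge advances this many columns per time unit
row_delay = 0.5       # extra readout delay per row (rolling shutter only)

def edge_column(t):
    """Column where the moving edge sits at time t."""
    return int(speed * t)

def capture(rolling):
    img = np.zeros((H, W), dtype=int)
    for r in range(H):
        t = r * row_delay if rolling else 0.0   # all rows share t=0 under a global shutter
        img[r, :edge_column(t) + 1] = 1         # pixels left of the edge are "bright"
    return img

print("global shutter:\n", capture(rolling=False))
print("rolling shutter (edge appears slanted):\n", capture(rolling=True))
```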
knguyen0811
As mentioned in lecture, the process described on this slide is based on the photoelectric effect: light strikes the sensor and releases electrons that can be read out. The same process is exploited when shooting in low light with image intensifiers, where the released electrons are amplified before they are read out, capturing more detail than an ordinary camera would.
Michael-hsiu
I wonder if the rolling shutter artifact could be reduced by using a clustering algorithm on past data to find regions of the sensor with similar colors, and resetting the pixels that best represent each "group" of pixel colors first. Maybe while the shutter is open, pixels are randomly selected to be read at a given timestep, and each pixel's past read data is kept to inform the clustering algorithm.
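To make that idea a bit more concrete, here is a purely hypothetical sketch (no real sensor exposes this kind of per-pixel reset control, and the clustering is just a naive k-means on a stand-in "previous frame"): it groups pixels by their past color and computes a read order that visits one representative of each group first, then the rest in random order.

```python
# Hypothetical read-order sketch for the clustering idea above; not a real readout scheme.
import numpy as np

rng = np.random.default_rng(0)
H, W, K = 4, 4, 3                      # tiny sensor and number of color clusters

prev_frame = rng.random((H, W, 3))     # stand-in for "past pixel read data" (RGB)
pixels = prev_frame.reshape(-1, 3)

# Naive k-means on the previous frame's colors.
centers = pixels[rng.choice(len(pixels), K, replace=False)]
for _ in range(10):
    labels = np.argmin(((pixels[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    centers = np.array([pixels[labels == k].mean(0) if np.any(labels == k) else centers[k]
                        for k in range(K)])

# Read order: first the pixel closest to each cluster center (the "representatives"),
# then the remaining pixels in random order, as suggested above.
dists = ((pixels - centers[labels]) ** 2).sum(-1)
reps = [np.flatnonzero(labels == k)[np.argmin(dists[labels == k])]
        for k in range(K) if np.any(labels == k)]
rest = rng.permutation([i for i in range(H * W) if i not in reps])
read_order = list(reps) + list(rest)
print("pixel read order (flattened indices):", read_order)
```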