The filter filters out some of the high-frequency information to avoid aliasing, which is similar to what we learned with mipmaps.
colinsteidtmann
I think it works because high-frequency details in an image can cause aliasing, where the image appears to have patterns or distortions that are not present in the original scene. By splitting and thus blurring these high-frequency details, the birefringent material helps to prevent aliasing from occurring in the final image.
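To make this concrete, here is a toy numpy sketch (my own simplification, not the actual optics): a 4-spot birefringent filter splits each ray into four copies displaced by roughly one pixel pitch, which acts like convolving the image with a small 2x2 box kernel before the sensor samples it. Applied to a stripe pattern that alternates every pixel (the worst case for aliasing), the split wipes out that detail:

```python
import numpy as np

# Toy model of a 4-spot birefringent OLPF: each ray is split into four
# copies displaced by one pixel pitch, equivalent to convolving the image
# with a 2x2 box kernel before sampling.
def olpf_blur(img):
    """Average each pixel with its right, down, and diagonal neighbors."""
    padded = np.pad(img, ((0, 1), (0, 1)), mode="edge")
    return 0.25 * (padded[:-1, :-1] + padded[:-1, 1:] +
                   padded[1:, :-1] + padded[1:, 1:])

# Stripes that alternate every single pixel -- detail at the sensor's
# sampling limit, exactly what causes moiré without a filter.
stripes = np.tile([[1.0, 0.0]], (8, 4))  # an 8x8 image of 1-pixel stripes

blurred = olpf_blur(stripes)
print(stripes.std(), blurred.std())  # contrast drops sharply after the blur
```

The high-contrast stripes become a nearly flat gray, which is the "blurring of high-frequency details" in action.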
ttalati
I am still quite confused about why splitting blurs the high-frequency details. Is it that each sensor gets rays from multiple peripheral filters, or is it that the different RGB sensors now get a mixture of the light that travels between the different wells?
dhruvchowdhary
It's cool how the anti-aliasing filter uses a layer that splits light to avoid weird patterns in our photos. It's like the filter is gently smudging the tiny details so that the camera doesn't pick up stuff that can make the image look strange. You have to get the balance right so that the picture is smooth without losing too much detail.
sebzhao
This is really cool to see low-pass filters implemented using real materials rather than using an algorithm over the pixels! Apparently these were really popular in cameras in the 2000s and 2010s. I wonder why they are no longer used.
Mehvix
@sebzhao I believe Professor Ng mentioned during lecture that, as sensor resolution/density increases (and with it the camera's sampling frequency), the strength of the low-pass filter should be decreased so as not to over-blur the signal.
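A quick 1-D sketch of that point (my own toy illustration, not from lecture): a 60 Hz tone sampled below its Nyquist rate folds down to a false low frequency, the 1-D analogue of moiré on a sensor. Doubling the sampling rate captures it correctly, so no pre-blur is needed for that frequency anymore:

```python
import numpy as np

f_tone = 60.0  # a tone well above the low sampling rate's Nyquist limit

def sample(rate):
    """One second of the tone sampled at `rate` samples per second."""
    n = np.arange(rate)
    return np.sin(2 * np.pi * f_tone * n / rate)

def dominant_freq(x, rate):
    """Frequency (Hz) of the strongest bin in the real FFT of x."""
    spectrum = np.abs(np.fft.rfft(x))
    return float(np.fft.rfftfreq(len(x), d=1.0 / rate)[np.argmax(spectrum)])

print(dominant_freq(sample(100), 100))  # 40.0 -- aliased (Nyquist is 50 Hz)
print(dominant_freq(sample(200), 200))  # 60.0 -- captured correctly
```

This is why a denser sensor can get away with a weaker (or no) optical low-pass filter: more of the lens's detail falls below the sensor's Nyquist limit and no longer needs to be blurred away.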
s3kim2018
I think demosaicking in general can prevent some level of aliasing, since it effectively downsamples each color channel and forces RGB values to be interpolated at every other pixel. However, I would assume the effect is very minuscule and a dedicated antialiasing filter would still be needed.
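A minimal sketch of the interpolation step described above, for the green channel of an RGGB Bayer mosaic (my own simplified bilinear version, not any specific camera's pipeline): each pixel records only one color, so the missing green samples are filled by averaging the four neighboring green sites:

```python
import numpy as np

def interp_green(mosaic):
    """Bilinear green-channel interpolation for an RGGB Bayer mosaic."""
    h, w = mosaic.shape
    green = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            if (y + x) % 2 == 1:      # green sites in an RGGB pattern
                green[y, x] = mosaic[y, x]
            else:                     # red/blue site: average green neighbors
                nbrs = [mosaic[ny, nx]
                        for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1))
                        if 0 <= ny < h and 0 <= nx < w]
                green[y, x] = sum(nbrs) / len(nbrs)
    return green

# A mosaic whose green samples are all 0.5 (red/blue sites hold other data):
# interpolation recovers a flat 0.5 green plane, ignoring the 0.9 values.
yy, xx = np.meshgrid(np.arange(4), np.arange(4), indexing="ij")
mosaic = np.where((yy + xx) % 2 == 1, 0.5, 0.9)
print(interp_green(mosaic))  # a 4x4 array of 0.5
```

The averaging itself is a mild low-pass operation, which is where the slight anti-aliasing effect mentioned above would come from.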