Lecture 3: Antialiasing
danielhsu021202

Can you elaborate on why blurring the original image reduces the maximum signal frequency?

Boomaa23

I may have missed this in the lecture, but I didn't understand why sampling above the Nyquist rate entirely prevents aliasing. If aliasing is just a "false identity" caused by high frequencies, then isn't the level of aliasing that is acceptable to a person subjective? Personally, I feel like image 4 looks fine, though it is noted that image 5 is the first that should have "no aliasing".

jinweiwong

It's interesting that there is a tradeoff between aliasing and blurriness. I am excited to learn about techniques that modern cameras use to capture sharp images while avoiding aliasing artifacts.

litony396

@danielhsu021202 The way I understand how blurring reduces the max signal frequency is that blurring averages each pixel with its neighbors, which makes changes in color less sharp. Because of this, when you sample every 16 pixels, each sampled pixel represents the local average of its region rather than whichever extreme color the sample happened to land on. For example, a sampled pixel on the eye might be half white and half black, which is very high frequency because of how different the two colors are. If you blur that region, the colors are homogenized, which is lower-frequency information because there is less difference between neighboring colors. Applying the blur to the whole image therefore lowers the maximum signal frequency, because less information is packed into every pixel.
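That averaging intuition can be sketched in 1D. This is just an illustrative toy (the stripe period, phase offset, and stride-16 sampling are made up, not the slide's actual image): a fine stripe pattern sampled every 16 pixels aliases to a constant, while blurring first makes the samples report the local average instead.

```python
import numpy as np

# Toy 1D "image": black/white stripes with an 8-px period. The +0.5
# phase offset just keeps samples off the exact zero crossings.
x = np.arange(256)
signal = np.sign(np.sin(2 * np.pi * (x + 0.5) / 8))

# Naive sampling every 16 px: every sample lands on the same phase of
# the stripes, so the pattern aliases to a constant value.
naive = signal[::16]

# Blur first (16-px box filter = local average), then sample. Each
# 16-px window covers two full stripe periods, so it averages to gray.
kernel = np.ones(16) / 16
blurred = np.convolve(signal, kernel, mode='same')
prefiltered = blurred[::16]

print(naive)        # all the same extreme value: the stripes vanished
print(prefiltered)  # the local average (gray), the honest answer
```

The prefiltered samples are "blurry" rather than wrong: gray really is the best a 16-px sample can say about two full stripe cycles, which is the aliasing-vs-blur tradeoff mentioned above.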

Staffimjal

@danielhsu021202 Blurring the image is equivalent to applying a low-pass filter. An example of this is on Slide 46, where the blurred spatial-domain image is on the left and its equivalent frequency-domain image is on the right. You can see that all of the high frequencies are black, which means they have been attenuated.
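You can check the low-pass claim numerically. This is only a 1D sketch with an arbitrary Gaussian kernel (the sigma is my choice, not from the slide): the kernel's frequency response is ~1 at low frequencies and ~0 at high ones, which is exactly what "low-pass" means.

```python
import numpy as np

# A blur kernel is a low-pass filter: the magnitude of its Fourier
# transform passes low frequencies and attenuates high ones.
n = 256
sigma = 4.0  # blur width in pixels (arbitrary for this sketch)
x = np.arange(n) - n // 2
kernel = np.exp(-x**2 / (2 * sigma**2))
kernel /= kernel.sum()  # normalize so DC gain is exactly 1

# Frequency response: shift the kernel's center to index 0, then FFT.
response = np.abs(np.fft.fft(np.fft.ifftshift(kernel)))

print(response[0])       # DC (lowest frequency): ~1.0, fully passed
print(response[n // 2])  # highest representable frequency: ~0.0
```

Multiplying by this response in the frequency domain is the same operation as convolving with the kernel in the spatial domain, which is why the high-frequency region of the slide's spectrum goes black.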

Staffimjal

@Boomaa23 I think slides 63/64 are good theoretical examples of why sampling at or above the Nyquist rate entirely prevents aliasing. If you have a sinusoid that cycles every 32 pixels and you sample every 16 pixels (twice per cycle), there will be no aliasing. However, if the sinusoid cycled every 16 pixels and you sampled every 16 pixels, you would sample only the white parts of the sinusoid, so the pattern aliases away.

I do agree that image 4 looks acceptable, but if we look closely at the high-frequency edges like the child's hair and the sun, it's not the best representation of the underlying signal.

sjukurnael

When I googled "Nyquist Frequency", I came across one of its applications that I found pretty cool: digital audio processing. When recording sound, engineers must sample at least twice as fast as the highest frequency they want to capture (the Nyquist rate) to accurately reconstruct the audio signal, which yields clearer sound and better audio quality for listeners.
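The audio version of aliasing can be sketched in a few lines (the sample rate and tone frequencies below are arbitrary choices for illustration): at an 8000 Hz sample rate the Nyquist frequency is 4000 Hz, so a 5000 Hz tone produces the exact same samples as a (sign-flipped) 3000 Hz tone, i.e. it aliases down into the audible band as the wrong pitch.

```python
import numpy as np

fs = 8000                 # sample rate (Hz); Nyquist frequency = 4000 Hz
t = np.arange(80) / fs    # 10 ms worth of sample times

tone_5k = np.sin(2 * np.pi * 5000 * t)   # above Nyquist: will alias
alias_3k = np.sin(2 * np.pi * 3000 * t)  # what it masquerades as

# sin(2*pi*5000*n/8000) == -sin(2*pi*3000*n/8000) for integer n, so the
# sampled 5 kHz tone is indistinguishable from a flipped 3 kHz tone.
print(np.allclose(tone_5k, -alias_3k))  # True
```

This is why audio pipelines put an analog anti-aliasing low-pass filter in front of the sampler, the same prefilter-then-sample idea as blurring an image before downsampling.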

myxamediyar

It is really cool to see the drastic differences in the quality of these images as the sampling rate approaches the Nyquist rate. I like how the professor demonstrated the pitfalls of sampling high-frequency content (the strand of hair example) - it really puts into perspective how amazing the world of signal processing is.
