Lecture 15: Cameras & Lenses (92)
jayc809

We know that a lens has a single focus setting, so only rays coming from one specific distance are refracted to converge exactly on the sensor; that is how we get images like the ones on this slide, where different parts of the same scene are blurred differently. This makes me wonder how cameras or phones implement their autofocus feature. Surely they cannot actually calculate the distance to the object to be focused, so do they trial-and-error until they find a focus setting with the best clarity in the user-selected area? How do they know which direction to move?
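The trial-and-error idea in this question is essentially how contrast-detection autofocus works: the camera nudges the lens, measures sharpness in the selected region, and keeps moving in whichever direction sharpness improves. Below is a minimal sketch of that hill-climbing search; the `sharpness` function is a made-up stand-in (a real camera would measure image contrast, e.g. gradient energy, in the focus region), and the step sizes are arbitrary:

```python
# Sketch of contrast-detection autofocus: hill-climb the lens position
# toward maximum measured sharpness. sharpness() is a toy model, not a
# real contrast metric.

def sharpness(focus_mm, true_focus_mm=52.0):
    # Toy model: sharpness peaks when the lens sits at the in-focus position.
    return -(focus_mm - true_focus_mm) ** 2

def autofocus(start_mm, step_mm=4.0, min_step_mm=0.05):
    pos = start_mm
    best = sharpness(pos)
    direction = 1.0
    while step_mm > min_step_mm:
        trial = pos + direction * step_mm
        s = sharpness(trial)
        if s > best:          # got sharper: keep moving this way
            pos, best = trial, s
        else:                 # got blurrier: reverse and search finer
            direction = -direction
            step_mm /= 2.0
    return pos

print(round(autofocus(40.0), 2))  # converges near the true focus, 52.0
```

Note how the "which direction?" problem is answered by just guessing: if contrast drops, the search reverses. Phase-detection autofocus, by contrast, compares light arriving through opposite sides of the lens and can estimate both the direction and roughly the amount of defocus in one measurement, which is why it focuses faster.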

colinsteidtmann

It surprises me that the image quality of the person doesn't increase when we blur out the background; I wonder what the cause of that might be.

zepluc

In my opinion, the clarity or quality of the person in the image isn't directly improved by blurring the background, because the two are somewhat independent variables.

llejj

I think the only thing changed between the two images is the aperture size. We know that the circle of confusion is directly proportional to the aperture size, so the larger aperture has a blurred background. This should, in theory, cause the image quality of the person to increase, since we have more photons arriving on the sensor. I'm guessing they used a faster shutter speed in the left image, which is why the image quality looks the same.
