One example of an aberration is chromatic aberration: https://en.wikipedia.org/wiki/Chromatic_aberration
This is caused by the refractive index of the lens varying with wavelength. In a renderer, though, we can make a virtual lens ideal so this problem wouldn't arise -- but if for some reason we wanted a chromatic aberration effect, we could split each camera ray into separate R, G, and B rays that refract by different amounts.
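A minimal sketch of that idea: model the wavelength dependence with Cauchy's equation n(λ) = A + B/λ² and apply Snell's law per color channel. The coefficients below are rough values for BK7 glass and the per-channel wavelengths are assumed representative choices, not values from the course.

```python
import math

def cauchy_index(wavelength_um, A=1.5046, B=0.00420):
    """Cauchy's equation n(lambda) = A + B / lambda^2.
    Coefficients are approximate values for BK7 glass (assumed)."""
    return A + B / wavelength_um**2

def refract_angle(theta_in, n):
    """Snell's law for a ray entering glass from air (n_air ~ 1).
    Returns the refraction angle in radians."""
    return math.asin(math.sin(theta_in) / n)

theta_in = math.radians(30.0)
# Representative wavelengths per channel, in micrometers (assumed).
for name, wl in [("R", 0.65), ("G", 0.55), ("B", 0.45)]:
    n = cauchy_index(wl)
    theta_out = refract_angle(theta_in, n)
    print(f"{name}: n = {n:.4f}, refracted angle = {math.degrees(theta_out):.3f} deg")
```

Because n is larger for shorter wavelengths, the blue ray bends more than the red one, so the three channels focus at slightly different points -- exactly the color fringing the Wikipedia article describes.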
GKohavi
@dramenti To follow up, this actually causes a problem in computer vision tasks. There have been research papers where convolutional neural networks "cheat" by looking at chromatic aberration to solve self-supervised learning tasks with images.
julialuo
Sorry I might have missed this, but why are real optical lens systems so complex? Is it to try to mitigate the aberrations above? Can we assume that these complex lenses are like the ideal ones that converge all rays to a point?
Hsifnus
What other measures might be taken to minimize the effects of optical aberrations if they can't be entirely mitigated in real, compound lenses?
raghav-cs184
A nice way to model and build sensor systems from a lens model is through simulation software. One such program is Zemax, which lets you put in a set of specifications and then optimizes the lens design to meet them. This also lets you do things like correct chromatic aberration. EE118 provides a pretty good introduction to all these concepts in more detail!
rahulmalayappan
Chapter 6 of pbrt at http://www.pbr-book.org/3ed-2018/Camera_Models.html provides a great overview of more general camera models for ray tracing.