I have always known that your eyes can see better in the dark at night, but I never realized that the range of colors you see is different too. I understand that part of Professor Ren's research is to create artificial colors, and I wonder whether part of this involves manipulating how our rods and cones react to light.
zachtam
Does this mean that luminous flux depends on the surrounding lighting conditions, since V(lambda) would vary? This seems like it would be hard to deal with: a bright enough light could trigger the eye to partially change its adaptation state.
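To make the dependence on V(lambda) concrete, here is a minimal sketch of the photometric weighting. It uses a Gaussian stand-in for the photopic luminosity function (the real CIE V(lambda) is a tabulated curve, so the function `v_photopic` and the numbers below are illustrative only):

```python
import math

def v_photopic(lam_nm):
    # Rough Gaussian stand-in for the CIE photopic luminosity function
    # V(lambda): peaks at 1.0 near 555 nm. The real curve is tabulated.
    return math.exp(-((lam_nm - 555.0) ** 2) / (2 * 45.0 ** 2))

def luminous_flux(spd, lam_min=380.0, lam_max=780.0, steps=400):
    # Phi_v = 683 * integral of V(lambda) * Phi_e(lambda) d(lambda),
    # where spd(lambda) is the spectral power distribution in W/nm.
    dl = (lam_max - lam_min) / steps
    total = 0.0
    for i in range(steps):
        lam = lam_min + (i + 0.5) * dl
        total += v_photopic(lam) * spd(lam) * dl
    return 683.0 * total

# A flat 1 W/nm source: only the band near 555 nm contributes much,
# because V(lambda) falls off rapidly away from its peak.
flux = luminous_flux(lambda lam: 1.0)
```

Swapping in the scotopic curve V'(lambda) (different peak, different normalizing constant) changes the result for the same physical source, which is exactly the adaptation-dependence the question raises.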
KevinXu02
How can the difference in the range of colour that people see be used in the CG industry? Will this be considered in the rendering pipeline as the brightness of the monitor varies?
SuryaTalla22
I wonder if this trend has to do with the fact that longer wavelengths of light carry less energy, so the eye is trying to balance out the extra light by being less receptive to the higher-energy frequencies?
ElShroomster
Does this mean that to make a realistic game I should change the color of the emitted light on top of decreasing the brightness of light sources, or would this be taken care of automatically by our eyes depending on the scenario?
lycorisradiatu
@jinweiwong In terms of how our rods and cones react to light, I believe rod cells are highly sensitive to light at shorter wavelengths but are not sensitive to color. As a result, scotopic vision provides monochromatic vision in dark environments. In contrast, cone cells are responsible for color vision and high visual acuity.
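The rod/cone split above is why the luminosity curve itself shifts between day and night vision (the Purkinje shift). A qualitative sketch, using Gaussian stand-ins for the real CIE photopic and scotopic curves (the peaks at 555 nm and 507 nm are standard; the Gaussian shape and `lum_weight` helper are illustrative assumptions):

```python
import math

def lum_weight(lam_nm, peak_nm, width_nm=45.0):
    # Gaussian stand-in for a luminosity curve; the real photopic and
    # scotopic curves are CIE tables, so this is qualitative only.
    return math.exp(-((lam_nm - peak_nm) ** 2) / (2 * width_nm ** 2))

def photopic(lam_nm):   # cone-mediated vision, peaks near 555 nm
    return lum_weight(lam_nm, 555.0)

def scotopic(lam_nm):   # rod-mediated vision, peaks near 507 nm
    return lum_weight(lam_nm, 507.0)

# Equal-power blue-green (500 nm) vs. red (620 nm) lights:
ratio_day   = photopic(500) / photopic(620)
ratio_night = scotopic(500) / scotopic(620)
# ratio_night is far larger than ratio_day: as the eye dark-adapts,
# blue-greens look relatively brighter than reds (the Purkinje shift).
```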
keeratsingh2002
How does the photometric measurement of light impact the design of lighting in virtual environments, where the goal is to mimic real-world lighting conditions as closely as possible?
AlsonC
I really enjoyed learning more about how my eyes adapt. I knew before that my eyes probably adjusted to darker environments, which is why my eyes were always so sensitive when I finally left my room and was in the sun. However, I never knew that this changed the color spectrum as well.
myxamediyar
I agree with jinweiwong. I never knew that the perceived color spectrum changes based on the surrounding light. I guess I naively thought the brain had some processing units for lights of different frequencies, but I guess it might be the eye that works as the initial filter... Interesting! More reason to take cogsci classes haha
danielhsu021202
I guess this makes sense if we're considering that our brain and eyes are trying to optimize for visibility, so in different environments, it's more crucial to see different colors, since some colors are indeed more visible in environments of varying brightness too.
kujjwal
When performing these calculations in our code, is it always practical to integrate from zero to infinity, or are there mathematical shortcuts or tighter bounds we can use to reduce the computational complexity? Performing this computation for high-resolution images in our rendering pipeline seems like it would be rate-limiting, so what roundoff or computational shortcuts do industry programmers use to make it more efficient?
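One standard shortcut: since V(lambda) is effectively zero outside roughly 380–780 nm, the "0 to infinity" integral collapses to a finite range, and for smooth spectra a small number of quadrature samples already suffices. A sketch (the toy `spectrum` below is a hypothetical smooth SPD, chosen only to illustrate the convergence):

```python
import math

def spectrum(lam):
    # Toy smooth spectral power distribution peaking near 555 nm
    # (illustrative only, not a real light source).
    return math.exp(-((lam - 555.0) ** 2) / (2 * 45.0 ** 2))

def integrate(f, lam_min=380.0, lam_max=780.0, n=8):
    # Midpoint rule over the visible range only: V(lambda) is
    # negligible outside it, so nothing is lost by truncating.
    dl = (lam_max - lam_min) / n
    return sum(f(lam_min + (i + 0.5) * dl) for i in range(n)) * dl

coarse = integrate(spectrum, n=8)      # just 8 samples
fine   = integrate(spectrum, n=4096)   # brute-force reference
rel_err = abs(coarse - fine) / fine    # tiny for smooth integrands
```

In production renderers the bigger shortcut is avoiding full spectra entirely: most pipelines work in RGB or a low fixed sample count per spectrum, so the per-pixel cost is a handful of multiply-adds rather than a fine numerical integral.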
jacky-p
From my understanding, radiometry is the overall study of illumination/light, and photometry is the sub-branch that deals with light in the part of the spectrum visible to humans. It is interesting how our brain perceives light in different ways depending on the surrounding environment, and in order to create computer graphics that mimic reality we must implement these adjustments in our programs. Basically, rather than just working with the science of light, we must also work with how our brains interpret light.
diandestroyer
I came across some interesting stories about synesthesia recently, and one of them claimed that although we cannot visually perceive ultraviolet and infrared the way we do visible light, the brain still processes and interprets these wavelengths to an extent. It went on to say that some people with synesthesia actually report seeing colors associated with ultraviolet or infrared stimuli. This made me curious whether studies have actually been done, and whether there is a way to quantify their scotopic and photopic visual ranges.
weszhuang
When calculating these integrals in practice, are they accelerated by fast vector multiplication algorithms that approximate them with Fourier transforms? Or is this generally unnecessary because of the low number of light-frequency sample points?
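It is generally the latter: renderers typically store spectra at a few dozen wavelength samples, so applying a photometric weighting is just a short dot product, far below the problem sizes where FFT-style acceleration pays off. A sketch with assumed values (31 samples at 10 nm spacing, a toy flat source, and a Gaussian stand-in for the tabulated CIE V(lambda)):

```python
import math

# 31 wavelength samples from 400 to 700 nm at 10 nm spacing -- a
# typical order of magnitude for sampled spectra in renderers.
wavelengths = [400.0 + 10.0 * i for i in range(31)]
spd = [1.0 for _ in wavelengths]   # toy flat source, W/nm
v   = [math.exp(-((lam - 555.0) ** 2) / (2 * 45.0 ** 2))  # Gaussian
       for lam in wavelengths]     # stand-in for tabulated CIE V(lambda)

# The whole "integral" is one 31-element dot product times the
# sample spacing -- roughly 31 multiply-adds per spectrum, so there
# is nothing for an FFT to accelerate at this size.
dl = 10.0  # nm between samples
flux = 683.0 * sum(vi * si * dl for vi, si in zip(v, spd))
```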