Lecture 18: Color Science (35)
CeHao1

This technique is very common in camera systems. It is especially useful when we take photos in a very dark environment.

The digital processor automatically stretches the luminance range to 0-255 so that we can make full use of our perceptual range.

But a bad consequence is that in some videos, when a fire is suddenly set off, the surroundings look very dark. This is because the fire is so bright that the camera has to map it into the finite range, so the brightness of every other object is scaled down and they appear darker.
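A minimal sketch of the range stretching described above (a simple linear contrast stretch; the function name and the min/max normalization are my own illustration, not necessarily what any particular camera pipeline does):

```python
import numpy as np

def stretch_luminance(raw):
    """Linearly remap sensor luminance to the full 0-255 range.

    raw: 2-D array of luminance values on any scale. A dark scene
    that occupies only a narrow slice of the sensor's range gets
    remapped so its minimum -> 0 and its maximum -> 255.
    """
    raw = raw.astype(float)
    lo, hi = raw.min(), raw.max()
    if hi == lo:  # flat image: nothing to stretch
        return np.zeros_like(raw, dtype=np.uint8)
    return ((raw - lo) / (hi - lo) * 255).round().astype(np.uint8)
```

Note how this also explains the fire effect: a single very bright region raises `hi`, which compresses every other pixel toward 0, darkening the surroundings.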

chethus

This transformation doesn't change any RGB values of 0 in the raw input. Is this desired or are there other white balancing schemes that do this?

phoebeli23

If the raw input was (255, 255, 255) (white) and the raw input of white object was also (255, 255, 255), wouldn't the white balanced output become (1, 1, 1) (black)?

nobugnohair

I think in this formula (1, 1, 1) is white and (0, 0, 0) is black as we used them in the previous project.

adityaramkumar

What do the coefficients 1/R'w, 1/G'w, and 1/B'w represent geometrically (I know it says the raw input of a white object)? What if we didn't have 0s in our matrix? How would that change the RGB output values?
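The diagonal (von Kries-style) white-balance transform being discussed can be sketched as follows. This is an illustration of the formula only; the function name, the [0, 1] value range, and the final clipping step are my own assumptions:

```python
import numpy as np

def white_balance(img, white):
    """Diagonal white balance: scale each channel independently.

    img:   H x W x 3 array of raw RGB values in [0, 1]
    white: length-3 raw RGB of an object known to be white under
           the scene illuminant (R'w, G'w, B'w in the slides)
    """
    white = np.asarray(white, dtype=float)
    # Multiply channels by 1/R'w, 1/G'w, 1/B'w: this is the diagonal
    # matrix from the slides. The reference white maps to (1, 1, 1),
    # and raw zeros stay zero, as noted in the thread.
    balanced = img / white
    # Pixels brighter than the reference white exceed 1, so clip.
    return np.clip(balanced, 0.0, 1.0)
```

Geometrically, the three coefficients stretch each color axis so that the illuminant's color lands on the achromatic point (1, 1, 1); off-diagonal entries (nonzero 0s in the matrix) would instead mix channels, so e.g. the red output would depend on the raw green and blue values as well.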

melodysifry

I've noticed that when taking pictures on my phone camera, automatic white balance is applied, but it also lets you tap on a certain part of the scene, and the lighting across the entire image adjusts so that the area you tapped is well lit (objects in this area are clearly visible but not too dark or too light). It's hard to tell whether this is just changing the exposure of the image, or whether it's recoloring the image as well to make it warmer or cooler. When you do this, is it actually adjusting the white balance by selecting a different value for "white" relative to the area of the scene you tap, and recoloring the whole scene accordingly? Or is it simply adjusting the exposure?

seenumadhavan

In a similar vein to @melodysifry's question, I'm curious how our phones are able to show a preview of how the image would change if you, say, tap a part of the scene. I believe the two options are that our phones either display the raw camera feed with the modified settings (I believe this is how digital cameras work), or they apply computational photography techniques to digitally alter the raw camera feed as if it had the modified settings. I believe it's the latter, but I wasn't able to find a definitive answer online.
