Is there a globally-used algorithm for stitching HDR images together? I was wondering if each smartphone company uses their own or if there is a widely-used one that is very effective.
Carpetfizz
@ellenluo There are probably several different implementations of HDR, but I believe the most influential paper is by Paul Debevec and Jitendra Malik from UC Berkeley! It's called "Recovering High Dynamic Range Radiance Maps from Photographs".
http://www.pauldebevec.com/Research/HDR/debevec-siggraph97.pdf
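In case a concrete sketch helps: the merging step in that paper boils down to a weighted average of log exposures across the bracketed shots. Here's a rough Python/NumPy version, assuming the frames are already aligned and the log response curve g has already been recovered (the paper solves for g with a small least-squares system); all of the names and the helper itself are my own, not from the paper.

import numpy as np

def merge_hdr(images, exposure_times, g):
    """Merge a stack of aligned LDR exposures into a radiance map.

    images:         list of HxW uint8 arrays of the same scene
    exposure_times: shutter times in seconds, one per image
    g:              length-256 array, the recovered log response curve,
                    i.e. g(Z) = ln(E * dt) for pixel value Z
    """
    # Hat-shaped weight: trust mid-range pixels, distrust pixels that are
    # nearly black or nearly saturated.
    w = np.minimum(np.arange(256), 255 - np.arange(256)).astype(np.float64)

    num = np.zeros(images[0].shape, dtype=np.float64)
    den = np.zeros(images[0].shape, dtype=np.float64)
    for img, dt in zip(images, exposure_times):
        z = img.astype(np.int64)
        num += w[z] * (g[z] - np.log(dt))   # weighted log-radiance estimate
        den += w[z]
    log_E = num / np.maximum(den, 1e-8)     # avoid division by zero
    return np.exp(log_E)                    # linear radiance map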
Google also has an HDR+ algorithm, which produces noticeably better pictures than the HDR algorithm in the iPhone camera. It increases the dynamic range the picture can capture while keeping the exposure correct. https://thenextweb.com/plugged/2017/09/08/hdr/
andrewdcampbell
I was curious to see how the "Smart HDR" on the iPhone XS differs from previous versions of HDR on the iPhone. It turns out it just takes more shots at exposures in between the underexposed and overexposed frames. That way, bright regions can keep their color and detail instead of being blown out.
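I don't know Apple's actual pipeline, but the intuition of blending several intermediate exposures can be sketched as a simple "well-exposedness" weighting: each pixel is taken mostly from the frame where it is neither crushed to black nor blown out. This is just a toy, single-scale version of exposure fusion; the function name and the 0.2 width constant are my own assumptions, and real pipelines also do alignment, multi-scale blending, and noise modeling.

import numpy as np

def fuse_bracket(images):
    """Blend a bracket of aligned exposures directly in the display domain.

    images: list of HxW float arrays with values in [0, 1]
            (e.g. under-, mid-, and over-exposed frames)
    """
    num = np.zeros_like(images[0])
    den = np.zeros_like(images[0])
    for img in images:
        # Gaussian weight centered at mid-gray: favor well-exposed pixels.
        w = np.exp(-((img - 0.5) ** 2) / (2 * 0.2 ** 2))
        num += w * img
        den += w
    return num / np.maximum(den, 1e-8)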
upasanachatterjee
I'm wondering why HDR isn't the default for cameras now. Is it because it's that much more computationally intensive?
muminovic
My guess would be the same as yours: it doesn't seem very efficient. Even taking a single picture in HDR mode on an iPhone feels significantly slower than taking a regular photo, and they probably don't want to apply it by default for all consumers, since most people taking pictures on their iPhones aren't shooting for amazing photographs anyway.
Carpetfizz
I also wanted to share some slides from CS194-26 about the HDR algorithm. They go into a lot more detail about where HDR fits into the imaging pipeline and include some surprisingly short pseudocode for the algorithm.
https://inst.eecs.berkeley.edu/~cs194-26/fa18/Lectures/HDR.pdf
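On the display side of that pipeline, the radiance map from the merge step still has to be squeezed back into a low-dynamic-range image before it can be shown on screen. As one rough illustration (a Reinhard-style global operator, which I believe the slides touch on but is not necessarily the exact method they use, and the function name and default key are mine):

import numpy as np

def tone_map(radiance, key=0.18):
    """Map a linear HDR radiance map to an 8-bit displayable image."""
    eps = 1e-6
    log_avg = np.exp(np.mean(np.log(radiance + eps)))  # log-average luminance
    scaled = key * radiance / log_avg                   # scale scene to "middle gray"
    ldr = scaled / (1.0 + scaled)                       # compress highlights into [0, 1)
    return (255 * np.clip(ldr, 0, 1)).astype(np.uint8)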