With the release of its iPhone 11 lineup, Apple also announced a new computational photography feature dubbed ‘Deep Fusion’, which will be rolled out to the new iPhones in a software update this fall. The feature is Apple’s take on Google’s HDR+, which is quite popular on the Pixel series of phones.
Building on Apple’s progress in machine learning, the feature promises images with low noise, high detail, and wide dynamic range, much like Google’s Pixel phones.
The feature works by capturing nine images: four taken before the shutter is pressed, four at the moment it is pressed, and a single long exposure. The Neural Engine then analyzes the frames, picks the best ones, and combines them pixel by pixel.
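Apple hasn’t published implementation details, but the general idea of multi-frame fusion, weighting each pixel by local detail across a burst of short exposures and blending in a long exposure, can be sketched roughly as follows (the gradient-based weight and the 50/50 blend are illustrative assumptions, not Apple’s actual pipeline):

```python
import numpy as np

def sharpness_weight(frame):
    # Per-pixel detail estimate via gradient magnitude -- a simple
    # stand-in for whatever metric Apple's Neural Engine actually uses.
    gy, gx = np.gradient(frame.astype(np.float64))
    return np.hypot(gx, gy) + 1e-6  # avoid all-zero weights

def deep_fusion_sketch(short_frames, long_exposure, alpha=0.5):
    """Toy multi-frame merge: pixel-wise weighted average of the short
    frames (weighted by local detail), blended with the long exposure."""
    stack = np.stack([f.astype(np.float64) for f in short_frames])
    weights = np.stack([sharpness_weight(f) for f in short_frames])
    weights /= weights.sum(axis=0, keepdims=True)
    detail = (stack * weights).sum(axis=0)  # detail from short frames
    return alpha * detail + (1 - alpha) * long_exposure.astype(np.float64)

# Demo: eight noisy short frames plus one cleaner long exposure
# of the same synthetic scene.
rng = np.random.default_rng(0)
scene = np.linspace(0, 255, 64 * 64).reshape(64, 64)
shorts = [scene + rng.normal(0, 20, scene.shape) for _ in range(8)]
long_exp = scene + rng.normal(0, 2, scene.shape)
fused = deep_fusion_sketch(shorts, long_exp)
# The fused result should sit closer to the true scene than any
# single noisy short frame.
print(np.abs(fused - scene).mean() < np.abs(shorts[0] - scene).mean())
```

Averaging many short frames suppresses random noise while the weighting keeps detailed regions crisp, which is broadly why burst-based pipelines like HDR+ and Deep Fusion outperform single-shot captures.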
Not wanting to stray from the trend, Samsung has reportedly started working on a computational photography feature of its own.
The news was broken by the well-known leaker Ice universe (@UniverseIce), whose leaks have a strong track record of being accurate.
In a recent tweet, he reported that Samsung is working on its own version of Deep Fusion.
Samsung is also developing the “Deep Fusion” function similar to the iPhone 11 Pro, and really plays the role of the NPU in taking pictures and videos. We can look forward to the Galaxy S11.
— Ice universe (@UniverseIce) September 12, 2019
We might see the feature arrive sometime next year on one of Samsung’s flagships, or, like Apple, Samsung may roll it out to its existing flagships via a software update.