Apple’s new iPhone 15 is an underwhelming ‘slap in the face,’ say disappointed fans::Apple unveiled its new iPhone 15 models this week, and some fans say they lack innovation.
Correct me if I’m wrong, but isn’t portrait mode just a blurred background? Google Photos has had the ability to blur backgrounds (at least on Pixel) on old photos for a fair while.
The iPhone’s portrait mode uses actual depth information captured by a separate depth sensor. The new feature is that it will capture the depth information for every picture you take, so that later you can use it to blur parts of the image at different depths. Google’s version of portrait mode just uses image recognition to detect what’s in the background. It does a good job, but not as good as if it had actual depth information.
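To make the difference concrete, here’s a minimal sketch (not Apple’s or Google’s actual pipeline) of what depth-based blurring looks like: given a grayscale image and a per-pixel depth map, pixels whose depth is far from the chosen focus plane get blurred, while in-focus pixels are left untouched. The function name, the crude box blur standing in for a real lens-blur kernel, and all parameters are illustrative assumptions.

```python
import numpy as np

def depth_aware_blur(image, depth, focus_depth, tolerance, blur_radius=2):
    """Blur pixels whose depth differs from focus_depth by more than tolerance.

    image: (H, W) grayscale float array; depth: (H, W) per-pixel depth map.
    A simple box blur stands in for a realistic lens-blur kernel.
    """
    h, w = image.shape
    # Edge-pad so the blur window is defined at the borders.
    padded = np.pad(image.astype(float), blur_radius, mode="edge")
    blurred = np.zeros((h, w), dtype=float)
    k = 2 * blur_radius + 1
    # Accumulate the k x k neighborhood sum via shifted views, then average.
    for dy in range(k):
        for dx in range(k):
            blurred += padded[dy:dy + h, dx:dx + w]
    blurred /= k * k
    # Keep original pixels near the focus plane; blur everything else.
    in_focus = np.abs(depth - focus_depth) <= tolerance
    return np.where(in_focus, image.astype(float), blurred)
```

A segmentation-based approach (like the commenter describes for Google Photos) would instead compute a subject mask from image content alone and blur everything outside it; with a real depth map you can vary the blur by distance instead of a single binary foreground/background split.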
It is bokeh - in some ways similar to a DSLR. I’d say the Apple implementation is better, but that’s just me.