Apple’s iOS 16 will bring numerous new iPhone features, such as editable Messages and a customizable lock screen. But one feature, shown for less than 15 seconds during WWDC 2022, really caught my eye.
The feature doesn’t have an official name, but here’s how it works: tap and hold on an image to separate its subject, such as a person, from the background. Keep holding and you can “lift” the cutout out of the photo and drag it into another app to post, share, or build a collage, for instance.
Technically, the tap-and-lift photo function is part of Visual Look Up, which debuted with iOS 15 and can identify a variety of things in your images, including pets, food, plants, and landmarks. In iOS 16, Visual Look Up lets you lift that object out of a photo or PDF with just a tap and hold.
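Apple hasn’t published the model behind subject lifting, but developers can approximate the “separate the subject from the background” step for people using Vision’s person-segmentation request, available since iOS 15. The sketch below is an illustration, not Apple’s actual implementation; the function name `liftPerson` is my own.

```swift
import Vision
import CoreImage

// Hedged sketch: approximate subject lifting for people using the
// documented Vision person-segmentation API (iOS 15+ / macOS 12+).
func liftPerson(from image: CGImage) throws -> CIImage? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .accurate                      // favor mask quality over speed
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    // The result is a grayscale mask: white where a person was detected.
    guard let mask = request.results?.first?.pixelBuffer else { return nil }

    // Scale the mask to the source size, then blend the subject
    // over a transparent background to produce the "lifted" cutout.
    let maskImage = CIImage(cvPixelBuffer: mask)
    let source = CIImage(cgImage: image)
    let scaled = maskImage.transformed(by: CGAffineTransform(
        scaleX: source.extent.width / maskImage.extent.width,
        y: source.extent.height / maskImage.extent.height))

    let blend = CIFilter(name: "CIBlendWithMask")!
    blend.setValue(source, forKey: kCIInputImageKey)
    blend.setValue(CIImage.empty(), forKey: kCIInputBackgroundImageKey)
    blend.setValue(scaled, forKey: kCIInputMaskImageKey)
    return blend.outputImage
}
```

This only handles people; Apple’s shipping feature also lifts pets, food, and other objects, presumably with a more general segmentation model.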
Robby Walker, Apple’s senior director of Siri Language and Technologies, demonstrated the new tap-and-lift feature on a photo of a French bulldog, “cutting” the puppy out of the picture and dropping it into the text field of a message.
Walker stated, “It seems like magic.”
Even though Apple occasionally overuses the word “magic,” this feature is indeed stunning. Walker was quick to point out that the effect is driven by an advanced machine learning model, accelerated by Core ML and Apple’s Neural Engine to perform 40 billion operations in a second.
It excites me to no end to know how much processing and machine learning goes into removing a dog from a picture. New phone features are often expected to be groundbreaking or to solve important problems. You could say the tap-and-hold tool solves the problem of erasing a photo’s background, which, for at least some people, may be a major one.