Google’s new Magic Editor pushes us toward AI-perfected fakery

Image: Google

One of the most impressive demos at Google I/O started with a photo of a woman in front of a waterfall. A presenter onstage tapped on the woman, picked her up, and moved her to the other side of the image, with the app automatically filling in the space where she once stood. They then tapped on the overcast sky, and it instantly bloomed into a brighter cloudless blue. In just a matter of seconds, the image had been transformed.

The AI-powered tool, dubbed the Magic Editor, certainly lived up to its name during the demo. It’s the kind of tool that Google has been building toward for years. The company already has a couple of AI-powered image editing features in its arsenal, including the Magic Eraser, which lets you quickly remove people or objects from the background of an image. But the Magic Editor takes things up a notch by letting you alter the contents — and potentially, the meaning — of a photo in much more significant ways.

While the tool clearly isn’t flawless — and there’s still no firm release date for it — Google’s end goal is clear: to make perfecting photos as easy as tapping or dragging something on your screen. The company markets the tool as a way to “make complex edits without pro-level editing tools,” allowing you to leverage the power of AI to single out and transform a portion of your photo. That includes the ability to enhance the sky, move and scale subjects, and remove parts of an image with just a few taps.

Google’s Magic Editor attempts to package all the steps it would take to make similar edits in a program like Photoshop into a single tap — or, at least, that’s what it looks like from the demo. In Photoshop, for example, you’d typically reach for the Content-Aware Move tool (or another method of your choice) to pick up and move a subject within an image. Even then, the photo still might not look quite right, which means you’ll have to pick up other tools, like the Clone Stamp tool or maybe even the Spot Healing Brush, to fix any leftover artifacts or a mismatched background. It’s not the most complicated process ever, but as with most professional creative tools, there’s a definite learning curve for people who are new to the program.

I’m all for Google making photo editing tools free and more accessible, given that Photoshop and some of the other image editing apps out there are expensive and pretty unintuitive. But putting powerful and incredibly easy-to-use image editing tools into the hands of, well, just about everyone who downloads Google Photos could transform the way we edit — and look at — photos. There have long been discussions about how far a photo can be edited before it’s no longer a photo, and Google’s tools push us closer to a world where we tap on every image to perfect it, reality or not.

Samsung recently brought attention to the power of AI-“enhanced” photos with “Space Zoom,” a feature that’s supposed to let you capture incredible pictures of the Moon on newer Galaxy devices. In March, a Reddit user tried using Space Zoom on a deliberately blurred, detail-free image of the Moon and found that Samsung appeared to add craters and other features that weren’t actually there. Not only does this run the risk of creating a “fake” image of the Moon, but it also leaves actual space photographers in a strange place: they spend years mastering the art of capturing the night sky, only for the public to be routinely presented with fakes.

Source: https://www.theverge.com/23721763/google-magic-editor-ai-photos-pixel-fakery
