It’s a debate as old as photography itself. On Friday, Reddit user u/ibreakphotos posted a few photos of the Moon that had the internet grappling with a familiar question: what is “truth” in photography?
The images in question show a blurred Moon alongside a much sharper and clearer version. The latter is a better image, but there’s one major problem with it. It’s not real, at least not in the sense that most of us think of a photo as real. Instead, it’s an image a Samsung phone generated from a crappy photo of the Moon, running it through sophisticated processing that fudges in details the original never captured. It may seem like a stretch to call that a photo, but given everything that smartphone cameras already do, it’s not actually the giant leap it appears to be. It’s more like a small step.
Samsung is no stranger to machine learning: it has spent the past several years toying with AI-enhanced high zoom by way of its aptly named Space Zoom. In most situations, Space Zoom combines data from an optical telephoto lens with multiple frames captured in quick succession, leaning on machine learning to come up with a much sharper image of distant subjects than you could normally get with a smartphone camera. It’s really good.
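For a sense of what frame stacking buys you, here’s a rough sketch of the general idea in Python: average a burst of noisy, pre-aligned frames to knock down sensor noise, then sharpen the result. To be clear, this is a toy illustration of the generic technique, not Samsung’s actual pipeline, and the function and parameter names are mine.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def stack_and_sharpen(frames, amount=1.5, radius=2.0):
    """Average a burst of pre-aligned grayscale frames, then unsharp mask.

    frames: list of 2D float arrays in [0, 1], all the same shape.
    Averaging N frames cuts random sensor noise by roughly sqrt(N),
    which leaves more headroom for sharpening without amplifying grain.
    """
    stacked = np.mean(np.stack(frames), axis=0)
    blurred = gaussian_filter(stacked, sigma=radius)
    # Unsharp mask: push the image away from its blurred copy.
    sharpened = stacked + amount * (stacked - blurred)
    return np.clip(sharpened, 0.0, 1.0)

# Toy usage: ten noisy captures of the same synthetic scene.
rng = np.random.default_rng(0)
scene = np.zeros((64, 64))
scene[20:44, 20:44] = 1.0
burst = [scene + rng.normal(0, 0.2, scene.shape) for _ in range(10)]
result = stack_and_sharpen(burst)
```

The averaging is what earns the sharpening: with ten frames, the random noise drops by roughly a factor of three, so the unsharp mask has actual detail to push against instead of grain.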
That’s not exactly what Samsung seems to be doing here. Outside of Moon photography, Samsung’s processing pipeline only works with the data in front of it. It will sharpen up the edges of a building photographed from several blocks away with an unsteady hand, but it won’t add windows to the side of the building that weren’t there to begin with.
The Moon seems to be a different case, and ibreakphotos’ clever test exposes the extra processing Samsung is doing. They took an intentionally blurred image of the Moon, displayed it on a screen, and photographed the screen with the phone. The resulting image shows details the camera couldn’t possibly have pulled from the source because they had been blurred away. Instead, Samsung’s processing is doing a little embellishment of its own: adding lines and, in a follow-up test, putting Moon-like texture in areas that were clipped to white in the original image. It’s not wholesale copy-and-pasting, but it’s not simply enhancing what it sees, either.
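If you’re curious to try the same trick, the doctored source image takes only a few lines to make. Here’s a minimal sketch using Pillow, assuming you have a Moon photo saved locally as moon.jpg (the filename, blur radius, and clipping threshold are all placeholders); you’d then display the result full-screen and point the phone at your monitor.

```python
from PIL import Image, ImageFilter

# Load a Moon photo (placeholder filename) and strip the fine detail.
moon = Image.open("moon.jpg").convert("L")
blurred = moon.filter(ImageFilter.GaussianBlur(radius=8))

# For the follow-up test: clip the bright areas to pure white, so there is
# literally no texture left for the camera to recover.
clipped = blurred.point(lambda v: 255 if v > 180 else v)

blurred.save("moon_blurred.png")
clipped.save("moon_clipped.png")
```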
But… is that so bad? The thing is, smartphone cameras already use a lot of behind-the-scenes techniques in an effort to produce photos that you like. Even if you turn off every beauty mode and scene-optimizing feature, your images are still being manipulated to brighten faces and make fine details pop in all the right places. Take Face Unblur on recent Google Pixel phones: if your subject’s face is slightly blurred from motion, it will use machine learning to combine an image from the ultrawide camera with an image from your main camera to give you a sharp final image.
Have you tried taking a picture of two toddlers both looking at the camera at the same time? It’s arguably harder than taking a picture of the Moon. Face Unblur makes it much easier. And it’s not a feature you enable in settings or a mode you select in the camera app. It’s baked right in, and it just works in the background.
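If you want a mental model for that kind of fusion, here’s a deliberately simplified, hypothetical sketch with OpenCV: score the face region in each camera’s frame by how blurry it looks, then keep the sharper crop. Google’s real pipeline uses machine learning and careful alignment between the two lenses, so treat this as a cartoon of the idea rather than how Face Unblur actually works.

```python
import cv2
import numpy as np

def sharpness(gray):
    # Variance of the Laplacian is a cheap, common blur metric: a
    # motion-blurred face has weaker edges, so the variance drops.
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def fuse_face(main_img, ultrawide_img, face_box):
    """Keep the main camera's frame, but swap in the face crop from
    whichever camera captured it more sharply. face_box = (x, y, w, h),
    assumed already aligned between the two (hypothetical) inputs."""
    x, y, w, h = face_box
    main_face = main_img[y:y+h, x:x+w]
    wide_face = ultrawide_img[y:y+h, x:x+w]

    main_score = sharpness(cv2.cvtColor(main_face, cv2.COLOR_BGR2GRAY))
    wide_score = sharpness(cv2.cvtColor(wide_face, cv2.COLOR_BGR2GRAY))

    out = main_img.copy()
    if wide_score > main_score:
        out[y:y+h, x:x+w] = wide_face
    return out
```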
To be clear, this isn’t the same thing that Samsung is doing with the Moon, since Face Unblur combines data from photos you’ve actually taken, but the reasoning is the same: to give you the picture you actually wanted to take. Samsung just takes it a step further than Face Unblur or any photo of a sunset you’ve ever taken with a smartphone.
The thing is, every photo taken with a digital camera is based on a little computer making some guesses. That’s true right down to the individual pixels on the sensor. Each one has either a green, red, or blue color filter. A pixel with a green color filter can only tell you how green something is, so an algorithm uses neighboring pixel data to make a good guess at how red and blue it is, too. Once you’ve got all that color data sorted out, then there are a lot more judgments to make about how to process the photo.
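To make that guessing concrete, here’s a toy version of the step, known as demosaicing, for a sensor with the common RGGB Bayer layout. Real camera pipelines use much smarter, edge-aware methods; this naive sketch just averages whichever neighboring pixels happen to carry the missing color.

```python
import numpy as np

def demosaic_bilinear(raw):
    """Naive demosaic of an RGGB Bayer mosaic (2D float array) into RGB.

    Each sensor pixel only recorded one color. For the two colors it
    didn't record, we guess by averaging nearby pixels that did.
    Edges wrap around via np.roll, which is fine for a toy example.
    """
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))

    # Where each color filter sits in the repeating RGGB pattern.
    r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)

    for channel, mask in enumerate([r_mask, g_mask, b_mask]):
        known = np.where(mask, raw, 0.0)
        count = mask.astype(float)
        # Sum the known samples (and how many there were) in each
        # pixel's 3x3 neighborhood, then divide to get the average.
        neighbor_sum = np.zeros((h, w))
        neighbor_cnt = np.zeros((h, w))
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                neighbor_sum += np.roll(np.roll(known, dy, 0), dx, 1)
                neighbor_cnt += np.roll(np.roll(count, dy, 0), dx, 1)
        interp = neighbor_sum / np.maximum(neighbor_cnt, 1e-6)
        # Keep measured values where we have them; guess everywhere else.
        rgb[..., channel] = np.where(mask, raw, interp)
    return rgb
```

Put another way: roughly two-thirds of the color values in any finished digital photo were never measured at all. They were interpolated, and that’s before any of the fancier processing starts.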
Year after year, smartphone cameras go a step further, trying to make smarter guesses about the scene you’re photographing and how you want it to look. Any iPhone from the previous half-decade will identify faces in a photo and brighten them for a more flattering look. If Apple suddenly stopped doing this, people would riot.
It’s not just Apple — any modern smartphone camera does this. Is that a picture of your best friend? Brighten it up and smooth out those shadows under their eyes! Is that a plate of food? Boost that color saturation so it doesn’t look like a plate of Fancy Feast! These things all happen in the background, and generally, we like them.
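That food-photo bump is about the simplest manipulation on the list. Here’s a hedged one-liner of the idea with Pillow, assuming a local photo called dinner.jpg; the 1.3 factor is an arbitrary stand-in for whatever a given camera app actually tunes per scene.

```python
from PIL import Image, ImageEnhance

# Placeholder filename; any RGB photo works.
photo = Image.open("dinner.jpg").convert("RGB")

# ImageEnhance.Color scales saturation: 1.0 is unchanged, >1.0 is punchier.
punchy = ImageEnhance.Color(photo).enhance(1.3)
punchy.save("dinner_enhanced.jpg")
```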
Would it be weird if, instead of just bumping up saturation to make your dinner look appealing, Samsung added a few sprigs of parsley to the image? Absolutely. But I don’t think that’s a fair comparison to Moon-gate.
The Moon isn’t a genre of photo the way “food” is. It’s one specific subject, isolated against a dark sky, that every human on Earth looks at. Samsung isn’t putting the Eiffel Tower or little green men in the picture; it’s making an educated guess about what should be there to begin with. Smartphone photos of the Moon also look categorically like garbage, and even Samsung’s enhanced versions still look pretty terrible. There’s no danger of someone with an S23 Ultra winning Astrophotographer of the Year.
Samsung is taking an extra step forward with its Moon photo processing, but I don’t think it’s the great departure from the ground “truth” of modern smartphone photography that it appears to be. And I don’t think it means we’re headed for a future where our cameras are just Midjourney prompt machines. It’s one more step on the journey smartphone cameras have been on for many years now, and if we’re taking the company to court over image processing crimes, then I have a few more complaints for the judge.