For years, Samsung phones equipped with “Space Zoom” have been known for their ability to take incredibly detailed photos of the Moon. But a recent Reddit post lays bare just how much computational processing the company is doing, and, given the evidence, it seems fair to say it outright: Samsung’s Moon photos are fake.
But what exactly does “fake” mean in this scenario? It’s a difficult question to answer, and one that will become increasingly important and complex as computational techniques are integrated ever more deeply into the imaging process. We can say with certainty that our understanding of what makes a photo artificial will soon change, just as it has changed before to accommodate digital cameras, Photoshop, Instagram filters, and so on. But for now, let’s focus on the case of Samsung and the Moon.
The test conducted by Reddit user u/ibreakphotos is simple. They deliberately blurred a photo of the Moon, displayed it on a computer screen, and then photographed the screen with a Samsung Galaxy S23 Ultra. As you can see below, the image on the screen shows no detail at all, yet the resulting photo shows a sharp, clear “picture” of the Moon. The S23 Ultra has added detail that wasn’t there to begin with. This isn’t upscaling of blurry pixels, and it isn’t recovery of seemingly lost data. There is simply a new Moon – a fake one.
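The key premise of the test is that blurring genuinely destroys fine detail, so any sharpness in the phone’s output must have been invented by software. As a rough illustration of that premise (not the Reddit user’s actual tooling, which used standard image-editing software), here is a minimal sketch in pure Python: a tiny synthetic grayscale “image” with one sharp bright feature is run through a repeated box blur, and a crude local-contrast measure shows the detail is gone afterward. All names (`box_blur`, `local_contrast`, the 8×8 `moon` array) are made up for this sketch.

```python
# Sketch of the detail-destroying step in the u/ibreakphotos test, on a
# synthetic 8x8 grayscale image instead of a real Moon photo.

def box_blur(img, passes=3):
    """Apply a 3x3 box blur `passes` times (edges clamped).

    Repeated box blurs approximate the Gaussian blur used in the test.
    Returns a new image; the input is not modified.
    """
    h, w = len(img), len(img[0])
    for _ in range(passes):
        out = [[0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                total = count = 0
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            total += img[ny][nx]
                            count += 1
                out[y][x] = total // count
        img = out
    return img

def local_contrast(img):
    """Max difference between horizontal neighbours -- a crude 'detail' score."""
    return max(abs(row[x] - row[x + 1])
               for row in img for x in range(len(row) - 1))

# A tiny stand-in for a lunar surface feature: bright pixels on a dark disc.
moon = [[30] * 8 for _ in range(8)]
moon[3][3] = moon[3][4] = 220  # the sharp "crater edge"

blurred = box_blur(moon, passes=4)
print("contrast before:", local_contrast(moon))
print("contrast after: ", local_contrast(blurred))
```

After four blur passes the contrast score collapses, which is the point of the test: if the camera’s output of this blurred image is sharp again, the detail did not come from the scene.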