With this feature, the company claimed, you can take extraordinary, precise photos of faraway subjects, even the moon. Back in 2021, the South Korean giant was accused of false advertising and of misleading customers by overlaying textures onto moon shots taken with its Space Zoom feature to make them appear higher-resolution. Samsung clarified at the time that “the Scene Optimizer process is not overlaying any textures or adding fake images,” but that it uses AI to identify the presence of the Moon and “then offers a detail enhancement feature by reducing blur and noise.”

Now, a Redditor has again accused Samsung of faking shots of the Moon with the AI-powered 100x “Space Zoom” feature on its latest flagship smartphone, the Galaxy S23 Ultra. Although the moon images from the device look stunning, they are simply not real. In a widely shared post, Reddit user u/ibreakphotos photographed an intentionally degraded image of the Moon to show that the 100x zoom level uses deception to generate highly detailed images of the lunar surface.

For the test, the user downloaded a high-resolution image of the Moon from the internet, downsized it to 170 x 170 pixels, and applied a Gaussian blur so that all fine detail was gone. The blurred image was then displayed on a computer screen, the lights in the room were switched off, and the user zoomed in on the monitor with a Samsung Galaxy device. Despite the lack of detail in the source, the final photo contained considerably more detail than the image on the screen.

“The moon pictures from Samsung are fake. Samsung’s marketing is deceptive,” the Redditor wrote in a lengthy Reddit post. “It is adding detail where there is none (in this experiment, it was intentionally removed)… The reality is, it’s AI doing most of the work, not the optics, the optics aren’t capable of resolving the detail that you see.
Since the Moon is tidally locked to the Earth, it’s very easy to train your model on other moon images and just slap that texture when a moon-like thing is detected.”

The user claimed Samsung’s phones are ‘leveraging an AI model to put craters and other details on places which were just a blurry mess’, and added: “There’s a difference between additional processing a la super-resolution, when multiple frames are combined to recover detail which would otherwise be lost, and this, where you have a specific AI model trained on a set of moon images, in order to recognize the Moon and slap on the Moon texture on it (when there is no detail to recover in the first place, as in this experiment).

“This is not the same kind of processing that is done when you’re zooming into something else, when those multiple exposures and different data from each frame account to something. This is specific to the Moon.”

Samsung has always openly admitted to using AI and deep learning when taking pictures of the moon (via an official Samsung announcement in 2022), which is no different in kind from the computational photography used on Pixel or iPhone devices. “The moon recognition engine was created by learning various moon shapes from full moon to crescent moon based on images that people actually see with their eyes on Earth,” the company said. “The process starts by identifying an object based on a realistic human eye view, then multi-frame fusion and upscaling add on by generating a higher level of detail to the subject, finally leveraged by AI deep learning solution it uses contextual assumption to process and piece together all the information to delivering a high-quality result.”

Samsung has yet to respond to the Redditor’s allegation that its moon shots are “fake” rather than merely “AI-enhanced,” as the South Korean tech giant maintains.
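For readers who want to try the experiment themselves, the degradation step u/ibreakphotos describes (downsize to 170 x 170 pixels, then apply a Gaussian blur) can be reproduced with a few lines of Pillow. This is a sketch only: the synthetic noise image stands in for the downloaded moon photo, and the output filename and 3 px blur radius are illustrative choices, since the original post does not specify a radius.

```python
# Sketch of the image-degradation step from the Reddit experiment, using Pillow.
from PIL import Image, ImageFilter

# Stand-in for the high-resolution moon photo downloaded from the internet
# (a grayscale noise image, so the script is self-contained).
hi_res = Image.effect_noise((2048, 2048), 64)

# Step 1: downsize to 170 x 170 pixels, discarding most fine detail.
small = hi_res.resize((170, 170))

# Step 2: Gaussian blur so that whatever detail survived is gone too.
# (Hypothetical radius; the post does not give one.)
blurred = small.filter(ImageFilter.GaussianBlur(radius=3))

# Display this full-screen on a monitor, darken the room, and photograph
# the screen with the phone at 100x zoom.
blurred.save("blurred_moon.png")
```

Any detail the phone then renders in the resulting moon shot cannot have come from the screen, which is the crux of the Redditor's argument.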