As spotted by 9to5Mac, Apple earlier this month updated a section of its website that discloses how it collects and uses imagery for Apple Maps’ Look Around feature, which is similar to Google Maps’ Street View. A newly added paragraph reveals that, beginning in March 2025, Apple will use imagery and data collected during Look Around surveys to “train models powering Apple products and services, including models related to image recognition, creation, and enhancement.”

Apple collects images and 3D data to improve Apple Maps using vehicles and, for pedestrian-only areas, backpacks equipped with cameras, sensors, and other equipment, including iPhones and iPads. The company says that, as part of its commitment to privacy, any captured images published in the Look Around feature have faces and license plates blurred, and that it will only use imagery with those details blurred out for training models. Apple also accepts requests from people who want their houses blurred, but houses are not blurred by default.

The Verge has reached out to Apple for confirmation on exactly which models will be trained using the imagery and will update this story accordingly. Apple Intelligence has several features powered by AI image generation models, including Image Playground, the Clean Up tool in Apple’s Photos app, which can remove parts of an image, and advanced image recognition that improves the Photos app’s search capabilities.
