Apple's new Object Capture API can create photorealistic 3D models using only an iPhone
Apple’s newly launched RealityKit 2 comes with new tools to build even more immersive AR experiences. The update to RealityKit, Apple’s 3D rendering engine for AR, includes a new Object Capture API capable of creating photorealistic 3D models using only an iPhone. Users can take photos of an object from all angles with an iPhone, iPad, or DSLR camera, then use the API on macOS Monterey to generate a 3D model. The model can then be viewed in AR Quick Look and added to AR scenes in Xcode or Reality Composer.
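In practice, the workflow runs through RealityKit’s `PhotogrammetrySession` on macOS: point it at a folder of photos, request a model file, and stream the results. A minimal sketch of what a command-line tool might look like is below; the input and output paths are placeholders, and detail levels and configuration values are illustrative choices, not requirements.

```swift
// Requires macOS Monterey (12.0+) and the RealityKit framework.
import Foundation
import RealityKit

// Placeholder paths: a folder of HEIC/JPEG photos taken from all angles,
// and a destination for the generated USDZ model.
let inputFolder = URL(fileURLWithPath: "/path/to/photos", isDirectory: true)
let outputURL = URL(fileURLWithPath: "/path/to/model.usdz")

// Optional tuning: sampleOrdering and featureSensitivity are hints to the
// reconstruction pipeline; the defaults are often sufficient.
var configuration = PhotogrammetrySession.Configuration()
configuration.sampleOrdering = .unordered
configuration.featureSensitivity = .normal

let session = try PhotogrammetrySession(input: inputFolder,
                                        configuration: configuration)

// Observe progress and completion on the session's async output stream.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .requestComplete(_, let result):
            print("Model written: \(result)")
        case .requestError(_, let error):
            print("Reconstruction failed: \(error)")
        case .processingComplete:
            exit(0)
        default:
            break
        }
    }
}

// Kick off reconstruction at a chosen detail level
// (.preview, .reduced, .medium, .full, or .raw).
try session.process(requests: [
    .modelFile(url: outputURL, detail: .medium)
])

// Keep the command-line process alive while the session works.
RunLoop.main.run()
```

Because reconstruction is compute-heavy, Apple exposes multiple detail levels: lower ones (`.reduced`) suit AR Quick Look on device, while `.full` and `.raw` target professional pipelines.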
After the announcement, Shopify’s development manager for AR/VR, Mikko Haapoja, shared a tweet where he used the Object Capture API with an iPhone 12 Pro Max to create a 3D model of a running shoe. He tweeted, “Apple's Object Capture is the real deal. I'm impressed.”
Zoom Out: What does this mean for the future, though? We can envision a tomorrow in which product marketplaces become even more digital than they are today. Merchants will be able to create a 3D model of their physical products within minutes, which consumers can then visualize in their homes through AR glasses. People will also be able to bring 3D models of personal memorabilia directly into any VR experience they enter. How far are we from this future? Not too far, it seems. Apple claims there are more than 1 billion AR-enabled devices powered by its ARKit framework, which works with RealityKit, and there are now 14,000 ARKit apps on the App Store.