Improvements in augmented reality
In the new ARKit 4 for iOS 14 and iPadOS 14, there are four big new features:
- Location Anchors allow developers to place AR scenes at geographic coordinates, so users can experience that AR content at specific locations, landmarks, and places around the world (see the Location Anchors sketch after this list).
- Extended face tracking support makes AR experiences available via the front camera on any device with the A12 Bionic chip or later (see the face tracking sketch after this list).
- RealityKit enables developers to add video textures to any part of the AR scene or to any AR object. Video textures also carry spatialized audio (see the video texture sketch after this list).
- Scene Understanding has one objective: to make virtual content interact with the real world. Scene Understanding is a new option set in the ARView.Environment object and contains four options: Occlusion, where real-world objects occlude virtual objects; Receives Lighting, which allows virtual objects to cast shadows on real-world objects; Physics, which enables virtual objects to interact physically with the real world; and Collision, which enables collisions between virtual objects and real-world objects (see the Scene Understanding sketch after this list).
Note
Activating the Receives Lighting option automatically turns on Occlusion. Activating the Physics option automatically turns on Collision.
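The following is a minimal Location Anchors sketch, assuming an existing RealityKit ARView; the coordinate, the startGeoTrackedExperience function name, and the sphere placeholder content are hypothetical:

```swift
import ARKit
import RealityKit
import CoreLocation

func startGeoTrackedExperience(in arView: ARView) {
    // Geo tracking is only available on supported devices and in supported regions.
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available, error == nil else { return }
        DispatchQueue.main.async {
            arView.session.run(ARGeoTrackingConfiguration())

            // Hypothetical coordinate; replace it with the landmark you want to anchor to.
            let coordinate = CLLocationCoordinate2D(latitude: 37.7954, longitude: -122.3937)
            let geoAnchor = ARGeoAnchor(coordinate: coordinate)
            arView.session.add(anchor: geoAnchor)

            // Attach RealityKit content to the geographic anchor.
            let anchorEntity = AnchorEntity(anchor: geoAnchor)
            anchorEntity.addChild(ModelEntity(mesh: .generateSphere(radius: 0.25)))
            arView.scene.addAnchor(anchorEntity)
        }
    }
}
```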
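Next, a short face tracking sketch, again assuming an existing ARView; the startFaceTracking helper is a hypothetical name:

```swift
import ARKit
import RealityKit

func startFaceTracking(in arView: ARView) {
    // Face tracking via the front camera is supported on devices
    // with the A12 Bionic chip or later.
    guard ARFaceTrackingConfiguration.isSupported else { return }
    let configuration = ARFaceTrackingConfiguration()
    // Track as many faces as the current device supports.
    configuration.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    arView.session.run(configuration)
}
```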
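Here is a video texture sketch using RealityKit's VideoMaterial; the bundled intro.mp4 file, the makeVideoScreen name, and the 16:9 plane size are hypothetical:

```swift
import RealityKit
import AVFoundation

func makeVideoScreen() -> ModelEntity? {
    // Hypothetical video file bundled with the app.
    guard let url = Bundle.main.url(forResource: "intro", withExtension: "mp4") else {
        return nil
    }
    let player = AVPlayer(url: url)

    // VideoMaterial renders the player's frames as a texture and plays
    // its audio spatialized from the entity's position in the scene.
    let videoMaterial = VideoMaterial(avPlayer: player)
    let screen = ModelEntity(mesh: .generatePlane(width: 0.4, height: 0.225),
                             materials: [videoMaterial])
    player.play()
    return screen
}
```

The returned entity can then be parented to any anchor in the scene, so the video and its audio play from that position.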
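Finally, a sketch of enabling the Scene Understanding options on an ARView; it assumes a LiDAR-equipped device for mesh reconstruction, and the enableSceneUnderstanding helper is a hypothetical name:

```swift
import ARKit
import RealityKit

func enableSceneUnderstanding(on arView: ARView) {
    // Scene reconstruction requires a LiDAR-equipped device.
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }
    arView.session.run(configuration)

    // Occlusion: real-world objects hide virtual content behind them.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    // Receives Lighting: virtual objects cast shadows on real-world objects
    // (this implicitly enables .occlusion).
    arView.environment.sceneUnderstanding.options.insert(.receivesLighting)
    // Physics: virtual objects interact physically with the reconstructed world.
    arView.environment.sceneUnderstanding.options.insert(.physics)
    // Collision: enables collisions between virtual and real-world objects
    // (implicitly enabled by .physics).
    arView.environment.sceneUnderstanding.options.insert(.collision)
}
```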
In this section, we have seen the improvements in augmented reality. Let's now review what's new in machine learning.