Apple revealed ARKit 3, its
latest set of developer tools for creating AR applications on iOS. ARKit 3 now
offers real-time body tracking of people in the scene as well as occlusion,
allowing AR objects to be convincingly placed in front of and behind those
people. Apple also introduced Reality Composer and RealityKit to make it easier
for developers to build augmented reality apps. ARKit 3 understands the
position of people in the scene, which lets the system composite virtual
objects correctly relative to them, rendering each object in front of or
behind a person depending on which is closer to the camera.
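
As a rough sketch of what this looks like in practice, ARKit 3 exposes people occlusion through a configuration's frame semantics. The snippet below assumes an existing RealityKit `ARView` named `arView`; it is an illustrative sketch rather than a complete app:

```swift
import ARKit
import RealityKit

// A minimal sketch of enabling ARKit 3's people occlusion,
// assuming an ARView named `arView` already exists in the app.
func startSessionWithPeopleOcclusion(in arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // People occlusion requires hardware and OS support, so check first.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        // Segment people in the camera feed and use their estimated depth,
        // so virtual content renders in front of or behind them correctly.
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }

    arView.session.run(configuration)
}
```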
RealityKit is designed to make it
easier for developers to build augmented reality apps on iOS. Building AR apps
requires a strong understanding of 3D development, tools, and workflows, a
barrier that makes many developers hesitant to jump into something new like AR,
and Apple is clearly trying to smooth that transition. RealityKit sounds almost
like a miniature game engine, offering photo-realistic rendering, camera
effects, animations, physics, and more. Rather than asking iOS developers to
learn game-engine tools such as Unity or Unreal Engine, Apple appears to be
positioning RealityKit as an option it hopes will be easier and more familiar
to its developers.
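
To give a sense of the flavor, here is a minimal RealityKit sketch, again assuming an `ARView` named `arView`: it anchors a simple box to the first horizontal plane ARKit detects, using only RealityKit's scene, entity, and material types rather than a separate game-engine toolchain.

```swift
import ARKit
import RealityKit
import UIKit

// A minimal RealityKit sketch, assuming an ARView named `arView`:
// place a small metallic box on a detected horizontal surface.
func placeDemoBox(in arView: ARView) {
    // Anchor content to a horizontal plane (e.g. a table or floor).
    let anchor = AnchorEntity(plane: .horizontal)

    // Generate a 10 cm box mesh and give it a simple physically based material.
    let mesh = MeshResource.generateBox(size: 0.1)
    let material = SimpleMaterial(color: .gray, roughness: 0.3, isMetallic: true)
    let box = ModelEntity(mesh: mesh, materials: [material])

    anchor.addChild(box)
    arView.scene.addAnchor(anchor)
}
```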