12. ARKit Connect API

ARKit can be connected to ARGear. Once connected, you can use ARKit and ARGear features (item, beauty, and bulge) together.

We provide Swift-based sample code. For details on ARKit, please refer to “https://developer.apple.com/documentation/arkit”.

12.1 Configuration

Once the setup in chapter 1, Configuration Settings, is complete, import ARKit.

import ARKit

12.2 Session Creation and Execution

Both an ARGSession and an ARKit ARSession need to be created and run together. For the ARGSession, please refer to chapter 3, ARGSession.

When creating the ARGSession, the ARGInferenceFeature needs to be set to .extARKitFaceTracking.

The sample code below shows ARGSession creation and execution.

do {
    let config = ARGConfig(
        apiKey: API_KEY,
        secretKey: API_SECRET_KEY)
    argSession = try ARGSession(argConfig: config, feature: .extARKitFaceTracking)
    argSession?.delegate = self
} catch let error as NSError {
    // handle error
} catch let exception as NSException {
    // handle exception
}

The sample code below shows ARKit session creation and execution.

arKitSession = ARSession()
arKitSession?.delegate = self

let arkitFaceTrackingConfig = ARFaceTrackingConfiguration()
arKitSession?.run(arkitFaceTrackingConfig)

12.3 Rendering

To draw a result frame on screen, the CVPixelBuffer and ARAnchor information from the ARFrame updated in ARSessionDelegate's session(_:didUpdate:) need to be passed to the ARGSession.

To do this, call the applyAdditionalFaceInfo function of ARGSession with the transform and geometry.vertices of the ARFaceAnchor, the size of the result view, and an ARGSessionProjectPointHandler (a point-conversion closure) as parameters.

When the didUpdate(_ arFrame: ARGFrame) function of ARGSessionDelegate is called, rendering proceeds with the information provided by ARKit.

When an item, filter, or beauty effect is set, it is rendered into the resulting ARGFrame as well.

The final rendered result is set in the ARGFrame's CVPixelBuffer, and you can draw the result with it.

The CVPixelBuffer may require rotation or flipping to display the frame correctly. For details, please refer to drawARCameraPreview() in our official sample code.

Sample code for rendering is shown below.

// ARSessionDelegate
public func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let viewportSize = view.bounds.size
    var updateFaceAnchor: ARFaceAnchor? = nil
    var isFace = false
    if let faceAnchor = frame.anchors.first as? ARFaceAnchor {
        if faceAnchor.isTracked {
            updateFaceAnchor = faceAnchor
            isFace = true
        }
    } else {
        if let _ = frame.anchors.first as? ARPlaneAnchor {
            // handle plane anchors here if needed
        }
    }

    // Converts a 3D point in world space to 2D view coordinates.
    let handler: ARGSessionProjectPointHandler = { (transform: simd_float3, orientation: UIInterfaceOrientation, viewport: CGSize) in
        return frame.camera.projectPoint(transform, orientation: orientation, viewportSize: viewport)
    }

    if isFace {
        if let faceAnchor = updateFaceAnchor {
            self.argSession?.applyAdditionalFaceInfo(withPixelbuffer: frame.capturedImage, transform: faceAnchor.transform, vertices: faceAnchor.geometry.vertices, viewportSize: viewportSize, convert: handler)
        }
    }
}

// ARGSessionDelegate
public func didUpdate(_ arFrame: ARGFrame) {
    guard let _ = arKitSession?.configuration as? ARFaceTrackingConfiguration else { return }
    if let cvPixelBuffer = arFrame.renderedPixelBuffer {
        // draw cvPixelBuffer
    }
}
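The "draw cvPixelBuffer" step above is left to the app. As a minimal sketch only (our official sample uses Metal in drawARCameraPreview(); the Core Image path below is a simplified assumption, and previewImageView is a hypothetical UIImageView in the hosting view controller):

```swift
import UIKit
import CoreImage

// Minimal sketch: display the rendered CVPixelBuffer via Core Image.
// This is NOT ARGear's drawARCameraPreview(); it is a simplified
// illustration, and `previewImageView` is a hypothetical UIImageView.
func draw(pixelBuffer: CVPixelBuffer) {
    var image = CIImage(cvPixelBuffer: pixelBuffer)
    // Rotate/flip as needed; the required orientation depends on how
    // the buffer was produced (see section 12.3).
    image = image.oriented(.right)
    let context = CIContext()
    if let cgImage = context.createCGImage(image, from: image.extent) {
        DispatchQueue.main.async {
            previewImageView.image = UIImage(cgImage: cgImage)
        }
    }
}
```

Creating a CIContext per frame is wasteful; in a real app, create it once and reuse it, or render the buffer with Metal as the official sample does.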

12.4 Set Contents

Please refer to chapter 8, Set Contents.