13. ARKit Connect API

ARKit can be connected to ARGear. Once connected, you can use ARKit functions together with ARGear functions (items, beauty, and bulge).

We provide Swift-based sample code to get you started. For details on ARKit, please refer to https://developer.apple.com/documentation/arkit.

13.1 Configuration

Once the steps in 1. Configuration Settings are done, you should import ARKit.

import ARKit

For the ARKit connection, ARGearRenderer.framework needs to be used instead of ARGear.framework.
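
For example, the imports would then look as follows. This is a minimal sketch; the module name ARGearRenderer is an assumption derived from the framework name, so match it to the framework you actually link.

// Assumption: the module name matches ARGearRenderer.framework.
import ARKit
import ARGearRenderer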

13.2 Session Creation and Execution

Both the ARGSession and the ARKit ARSession need to be created and executed together. For ARGSession, please refer to the chapter, 3. ARGSession.

When creating the ARGSession for ARKit, the ARGInferenceFeature setting is not required.

The sample code below shows ARGSession creation and execution when configured for ARKit.

let config = ARGConfig(
    apiURL: API_HOST,
    apiKey: API_KEY,
    secretKey: API_SECRET_KEY,
    authKey: API_AUTH_KEY
)

do {
    argSession = try ARGSession(argConfig: config)
    argSession?.delegate = self
} catch let error as NSError {
    // Handle ARGSession creation failure.
} catch let exception as NSException {
    // Handle an Objective-C exception raised during creation.
}

argSession?.run()

The sample code below shows ARKit Session creation and execution.

arKitSession = ARSession()

let arkitFaceTrackingConfig = ARFaceTrackingConfiguration()
arKitSession?.delegate = self
arKitSession?.run(arkitFaceTrackingConfig)
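
Face tracking is not available on every device, so you may want to check support before running the session. A minimal sketch using ARKit's standard ARFaceTrackingConfiguration.isSupported check:

// Optional: verify that face tracking is available on this device
// before running the ARKit session.
guard ARFaceTrackingConfiguration.isSupported else {
    // Face tracking is unavailable on this device; handle the fallback here.
    return
}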

13.3 Rendering

In order to draw a result frame on screen, the camera CVPixelBuffer and ARAnchor information of the ARFrame received through the ARSessionDelegate need to be passed to ARGSession.

For this, the applyAdditionalFaceInfo function of ARGSession should be called with the captured pixel buffer, the transform and geometry.vertices of the ARFaceAnchor, the size of the result view, and an ARGSessionProjectPointHandler (a point-conversion closure) as parameters.

Once the didUpdate(_ arFrame: ARGFrame) function of the ARGSessionDelegate is called, rendering proceeds according to the information provided from ARKit.

When Item, Filter, or Beauty is set, a resulting ARGFrame will be rendered together with ARKit and returned.

In the ARGFrame, the final rendered result is set in a CVPixelBuffer, which you can use to draw the frame.

The CVPixelBuffer may require rotation or flipping to show the frame correctly. For more details, please refer to drawARCameraPreview() in our official sample code.

Sample code for rendering is written below.

// ARSessionDelegate
public func session(_ session: ARSession, didUpdate frame: ARFrame) {

    let viewportSize = view.bounds.size
    // Pick up the face anchor from the updated ARFrame, if one is being tracked.
    var updateFaceAnchor: ARFaceAnchor? = nil
    var isFace = false
    if let faceAnchor = frame.anchors.first as? ARFaceAnchor {
        if faceAnchor.isTracked {
            updateFaceAnchor = faceAnchor
            isFace = true
        }
    }

    // Converts a 3D position into view coordinates using the ARKit camera.
    let handler: ARGSessionProjectPointHandler = { (transform: simd_float3, orientation: UIInterfaceOrientation, viewport: CGSize) in
        return frame.camera.projectPoint(transform, orientation: orientation, viewportSize: viewport)
    }

    if isFace, let faceAnchor = updateFaceAnchor {
        // Pass the camera pixel buffer and the face anchor information to ARGSession.
        self.argSession?.applyAdditionalFaceInfo(withPixelbuffer: frame.capturedImage, transform: faceAnchor.transform, vertices: faceAnchor.geometry.vertices, viewportSize: viewportSize, convert: handler)
    } else {
        // No tracked face: feed only the camera pixel buffer.
        self.argSession?.feedPixelbuffer(frame.capturedImage)
    }
}

// ARGSessionDelegate
public func didUpdate(_ arFrame: ARGFrame) {
    // Render only while ARKit is running a face tracking configuration.
    guard let _ = arKitSession?.configuration as? ARFaceTrackingConfiguration else {
        return
    }

    if let cvPixelBuffer = arFrame.renderedPixelBuffer {
        // draw cvPixelBuffer
    }
}
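
How the buffer is drawn is up to the application; drawARCameraPreview() in the official sample code shows the reference implementation. As an illustration only, the sketch below converts the rendered CVPixelBuffer to a CIImage, applies an orientation fix, and displays it in a UIImageView. The helper function and the .upMirrored value are assumptions, not part of the SDK; adjust the orientation to your camera and interface setup.

import UIKit
import CoreImage

// Illustration only: one possible way to display the rendered pixel buffer.
// The orientation value is an assumption; rotate or flip as needed for your setup.
func draw(pixelBuffer: CVPixelBuffer, in imageView: UIImageView, context: CIContext) {
    let image = CIImage(cvPixelBuffer: pixelBuffer).oriented(.upMirrored)
    if let cgImage = context.createCGImage(image, from: image.extent) {
        imageView.image = UIImage(cgImage: cgImage)
    }
}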

13.4 Set Contents

Please refer to the chapter, 8. Set Contents.
