13. ARKit Connect API
ARKit can be connected to ARGear. Once connected, you can use the features of both ARKit and ARGear (items, beauty, and bulge) together.
We provide Swift-based sample code to get you started. For details on ARKit, please refer to https://developer.apple.com/documentation/arkit.

13.1 Configuration

Once 1. Configuration Settings is done, import ARKit.

```swift
import ARKit
```
For the ARKit connection, ARGearRenderer.framework needs to be used instead of ARGear.framework.

13.2 Session Creation and Execution

Both the ARGSession and the ARKit ARSession need to be created and run together. For ARGSession, please refer to the 3. ARGSession chapter.
When creating the ARGSession for ARKit, the ARGInferenceFeature setting is not required.
The sample code below shows ARGSession creation and execution when configured for ARKit.
```swift
let config = ARGConfig(
    apiURL: API_HOST,
    apiKey: API_KEY,
    secretKey: API_SECRET_KEY,
    authKey: API_AUTH_KEY
)

do {
    argSession = try ARGSession(argConfig: config)
    argSession?.delegate = self
} catch let error as NSError {
    // handle initialization error
} catch let exception as NSException {
    // handle initialization exception
}

argSession?.run()
```
The sample code below shows ARKit session creation and execution.

```swift
arKitSession = ARSession()

let arkitFaceTrackingConfig = ARFaceTrackingConfiguration()
arKitSession?.delegate = self
arKitSession?.run(arkitFaceTrackingConfig)
```
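Once both sessions are running, they should also be paused together when the screen goes away; pausing only one can leave the other processing stale frames. A minimal lifecycle sketch, assuming ARGSession exposes a pause() counterpart to run() (see the 3. ARGSession chapter); the view-controller context is illustrative:

```swift
// Illustrative view-controller lifecycle; assumes ARGSession has a
// pause() counterpart to run() as described in 3. ARGSession.
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    arKitSession?.run(ARFaceTrackingConfiguration())
    argSession?.run()
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    arKitSession?.pause()
    argSession?.pause()
}
```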

13.3 Rendering

To draw a result frame on screen, the camera CVPixelBufferRef and the ARAnchor information of the updated ARFrame from the ARSessionDelegate need to be passed to ARGSession.
To do this, call the applyAdditionalFaceInfo function of ARGSession with the transform and geometry.vertices of the ARFaceAnchor, the size of the result view, and an ARGSessionProjectPointHandler (a point-converting closure) as parameters.
Once the didUpdate(_ arFrame: ARGFrame) function of the ARGSessionDelegate is called, rendering proceeds according to the information provided by ARKit.
When an item, filter, or beauty effect is set, the resulting ARGFrame will be rendered together with the ARKit output and returned.
In the ARGFrame, the final rendered result is set in a CVPixelBuffer, and you can draw the result with it.
The CVPixelBuffer may require rotation or flipping to show the frame correctly. For more details, please refer to drawARCameraPreview() in our official sample code.
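As a hypothetical illustration of that rotation/flip step (not the SDK's drawARCameraPreview() itself), Core Image can reorient the rendered CVPixelBuffer before display; the orientation value used here is an assumption and depends on device rotation and which camera is active:

```swift
import CoreImage

// Hypothetical helper: reorient a rendered pixel buffer before display.
// .rightMirrored (rotate 90 degrees and mirror, typical for a front-camera
// portrait preview) is an illustrative value only.
func orientedImage(from pixelBuffer: CVPixelBuffer) -> CIImage {
    let image = CIImage(cvPixelBuffer: pixelBuffer)
    return image.oriented(.rightMirrored)
}
```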
Sample code for rendering is written below.

```swift
// ARSessionDelegate
public func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let viewportSize = view.bounds.size
    var updatedFaceAnchor: ARFaceAnchor? = nil
    var isFace = false

    if let faceAnchor = frame.anchors.first as? ARFaceAnchor {
        if faceAnchor.isTracked {
            updatedFaceAnchor = faceAnchor
            isFace = true
        }
    } else if let _ = frame.anchors.first as? ARPlaneAnchor {
        // handle plane anchors here if needed
    }

    // Converts an ARKit world-space point into view coordinates.
    let handler: ARGSessionProjectPointHandler = { (transform: simd_float3, orientation: UIInterfaceOrientation, viewport: CGSize) in
        return frame.camera.projectPoint(transform, orientation: orientation, viewportSize: viewport)
    }

    if isFace, let faceAnchor = updatedFaceAnchor {
        // Pass the ARKit face information to ARGear for rendering.
        self.argSession?.applyAdditionalFaceInfo(withPixelbuffer: frame.capturedImage,
                                                 transform: faceAnchor.transform,
                                                 vertices: faceAnchor.geometry.vertices,
                                                 viewportSize: viewportSize,
                                                 convert: handler)
    } else {
        // No tracked face: feed the camera frame only.
        self.argSession?.feedPixelbuffer(frame.capturedImage)
    }
}

// ARGSessionDelegate
public func didUpdate(_ arFrame: ARGFrame) {
    guard let _ = arKitSession?.configuration as? ARFaceTrackingConfiguration else {
        return
    }

    if let cvPixelBuffer = arFrame.renderedPixelBuffer {
        // draw cvPixelBuffer
    }
}
```
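Putting the pieces together, one hosting object conforms to both delegate protocols so the two callbacks above are wired to the same sessions. A skeletal sketch; the class name and method split are illustrative, not part of the SDK:

```swift
// Illustrative skeleton: one view controller owns both sessions and
// receives both delegate callbacks shown in 13.2 and 13.3.
class ARKitConnectViewController: UIViewController {
    var argSession: ARGSession?
    var arKitSession: ARSession?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Create and run the ARGSession, then the ARSession (13.2),
        // assigning self as delegate for both.
    }
}

extension ARKitConnectViewController: ARSessionDelegate {
    // session(_:didUpdate:) from 13.3
}

extension ARKitConnectViewController: ARGSessionDelegate {
    // didUpdate(_:) from 13.3
}
```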

13.4 Set Contents

Please refer to the chapter, 8. Set Contents.