ARGear SDK Documentation
iOS

5. Camera


Last updated 4 years ago


Camera configuration information and video-related data should be provided to ARGear in the form of CMSampleBufferRef. Using this data, ARGear provides its face tracking features.

Using AVCaptureSession, set up your camera first. Video frames and related face information can be obtained from the AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureMetadataOutputObjectsDelegate callbacks of the AVCaptureVideoDataOutput and AVCaptureMetadataOutput classes, respectively.
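As a sketch of the setup described above, a minimal AVCaptureSession configuration with both outputs might look like the following. The class name, queue label, and session preset are illustrative, not part of the ARGear API; the delegate method bodies are the ones shown in the sample code further down.

```swift
import AVFoundation

final class CameraController: NSObject {
    let session = AVCaptureSession()
    let videoOutput = AVCaptureVideoDataOutput()
    let metadataOutput = AVCaptureMetadataOutput()
    private let captureQueue = DispatchQueue(label: "camera.capture")

    func configure() {
        session.beginConfiguration()
        session.sessionPreset = .hd1280x720

        // Front camera input.
        if let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                for: .video, position: .front),
           let input = try? AVCaptureDeviceInput(device: device),
           session.canAddInput(input) {
            session.addInput(input)
        }

        // Video frames (sampleBuffer) are delivered through this output.
        videoOutput.setSampleBufferDelegate(self, queue: captureQueue)
        if session.canAddOutput(videoOutput) {
            session.addOutput(videoOutput)
        }

        // Face metadata (metadataObjects) is delivered through this output.
        // metadataObjectTypes must be set after the output is added.
        metadataOutput.setMetadataObjectsDelegate(self, queue: captureQueue)
        if session.canAddOutput(metadataOutput) {
            session.addOutput(metadataOutput)
            if metadataOutput.availableMetadataObjectTypes.contains(.face) {
                metadataOutput.metadataObjectTypes = [.face]
            }
        }

        session.commitConfiguration()
        session.startRunning()
    }
}

// Conformance required by the two set...Delegate(self, queue:) calls above;
// implement the delegate methods as shown in the sample code below.
extension CameraController: AVCaptureVideoDataOutputSampleBufferDelegate,
                            AVCaptureMetadataOutputObjectsDelegate {}
```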

Then, pass the sampleBuffer and connection (or metadataObjects and connection) received from the delegate methods above to updateSampleBuffer and updateMetadataObjects in the ARGSession interface.
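The metadata half of this hand-off can be sketched as below. The Swift spelling `update(_:from:)` for updateMetadataObjects is an assumption based on Swift's usual renaming of Objective-C methods; check the generated ARGSession interface for the exact imported name.

```swift
import AVFoundation
// The ARGear framework is assumed to be linked so that ARGSession resolves.

final class MetadataForwarder: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    // Created elsewhere via ARGSession(argConfig:error:), as in the sample below.
    var argSession: ARGSession?

    // Counterpart to the video-frame delegate: forwards detected face
    // metadata to ARGSession for face tracking.
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        argSession?.update(metadataObjects, from: connection)
    }
}
```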

The sample code below shows an example of camera configuration and how to feed video frames to ARGear.

// Sample Code (Objective-C). Passing sampleBuffer to ARGSession

ARGSession *argSession = [[ARGSession alloc] initWithARGConfig:argConfig error:&error];
 
// AVCaptureVideoDataOutput Delegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    [argSession updateSampleBuffer:sampleBuffer fromConnection:connection];
}
// Sample Code (Swift). Passing sampleBuffer to ARGSession

// AVCaptureVideoDataOutput Delegate
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    argSession?.update(sampleBuffer, from: connection)
}
