Offscreen Rendering Guide

Follow this guide to enable the DeepAR iOS SDK to produce frames in pixel buffer format. These frames can be used, among other use cases, as input for video-calling and streaming APIs.

Off-screen rendering iOS

  • Create a DeepAR object.

  • Call (void)setLicenseKey:(NSString*)key to activate your license key.

  • Set the delegate of the DeepAR object.

  • Call (void)changeLiveMode:(BOOL)liveMode with NO (false in Swift).

  • Call (void)initializeOffscreenWithWidth:(NSInteger)width height:(NSInteger)height.

  • Feed frames through (void)processFrame:(CVPixelBufferRef)imageBuffer mirror:(BOOL)mirror, or (void)enqueueCameraFrame:(CMSampleBufferRef)sampleBuffer mirror:(BOOL)mirror if you are getting camera frames via AVFoundation.

  • Rendered frames are delivered through (void)frameAvailable:(CMSampleBufferRef)sampleBuffer via the DeepARDelegate.

Sample Swift initialisation code:

    deepAr = DeepAR()
    deepAr?.setLicenseKey("Your license key here!")
    deepAr?.delegate = self
    deepAr?.changeLiveMode(false)
    deepAr?.initializeOffscreen(withWidth: 1080, height: 1920)

Sample Objective-C initialisation code:

    self.deepAR = [[DeepAR alloc] init];
    [self.deepAR setLicenseKey:@"Your license key here!"];
    self.deepAR.delegate = self;
    [self.deepAR changeLiveMode:NO];
    [self.deepAR initializeOffscreenWithWidth:1080 height:1920];
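Initialisation is only half of the pipeline: frames still need to be fed in and the rendered output collected from the delegate. The following Swift sketch ties the steps together, assuming camera frames arrive via AVFoundation's AVCaptureVideoDataOutputSampleBufferDelegate (the capture-session wiring is illustrative and not part of the DeepAR API, and the bridged Swift signatures are assumed from the Objective-C declarations above):

    import AVFoundation
    // import DeepAR  // DeepAR SDK framework

    class OffscreenRenderer: NSObject, DeepARDelegate,
                             AVCaptureVideoDataOutputSampleBufferDelegate {

        private var deepAr: DeepAR?

        func setUp() {
            deepAr = DeepAR()
            deepAr?.setLicenseKey("Your license key here!")
            deepAr?.delegate = self
            deepAr?.changeLiveMode(false)
            deepAr?.initializeOffscreen(withWidth: 1080, height: 1920)
        }

        // AVFoundation callback: forward each camera frame to DeepAR.
        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            deepAr?.enqueueCameraFrame(sampleBuffer, mirror: false)
        }

        // DeepARDelegate callback: rendered frames arrive here, ready to be
        // handed to a video-calling or streaming API.
        func frameAvailable(_ sampleBuffer: CMSampleBuffer!) {
            // e.g. append to an AVAssetWriterInput or pass to a WebRTC capturer
        }
    }

If you already have frames as CVPixelBufferRef (for example, from a custom source rather than AVFoundation), call processFrame:mirror: instead of enqueueCameraFrame:mirror:.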