Video recording

Last updated: 2022-12-23 13:03

ZEGOCLOUD's Virtual Avatar SDK provides a video recording feature that allows your users to record avatar videos, together with audio captured by the microphone, at any time.

Your users can record the wonderful moments of avatars in videos and share them with friends.

Prerequisites

Before you implement the video recording feature, ensure that the following conditions are met:
  • The Virtual Avatar SDK has been integrated into your project. For details, see Integrate the SDK.
  • A basic virtual avatar has been created. For details, see Create a virtual avatar.

Implementation steps

Refer to the following steps to implement the video recording feature.

1. Start video recording

Before video recording:
  • To save videos to your phone album, configure the Privacy - Photo Library Additions Usage Description permission first.
  • To record audio data from the microphone, configure the Privacy - Microphone Usage Description permission first.

For details about how to configure the permissions, see Set Permissions.
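
In Info.plist source form, these permissions correspond to the NSPhotoLibraryAddUsageDescription and NSMicrophoneUsageDescription keys. The entries below are a minimal sketch; the description strings are placeholders that you should replace with wording that fits your app.

<key>NSPhotoLibraryAddUsageDescription</key>
<string>Used to save recorded avatar videos to your photo album.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Used to record your voice along with the avatar video.</string>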

After building a basic avatar, call the startRecord API of ZegoAvatarView and pass in the video recording configuration (ZegoRecordConfig). Among these parameters, videoPath is required and specifies the path where the video is saved. By default, videos are recorded in MP4 format.

// Customize the path where videos are saved and ensure that the path exists.
NSString *docDir = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
_videoPath = [NSString stringWithFormat:@"%@/avatar.mp4", docDir];
// Assemble the path.
ZegoRecordConfig *config = [ZegoRecordConfig new];
// Required: Specify the location to save your videos.
config.videoPath = _videoPath;
// Optional: Record audio.
config.recordAudio = YES;
// Optional: Specify the video format. The default format is MP4. Videos in MOV format are less universally supported and cannot be played on an Android device.
config.recordMode = ZegoAvatarRecordModeMP4;
// Optional: By default, the width of _avatarView is used as the video width. If specified, scaleWidth must not be larger than the width of _avatarView.
config.scaleWidth = _avatarView.frame.size.width;
// Start video recording. If recording fails to start (for example, because of missing permissions or an encoder creation error), "false" is returned.
// Recording runs on a sub-thread, so an error that occurs during recording (such as a frame encoding failure) is returned through the callback.
BOOL success = [_avatarView startRecord:config onStartRecordCallback:^(NSInteger errorCode, NSString *info) {
    if(errorCode != ZegoAvatarErrorCode_Success){
        NSLog(@"Avatar recording error, code: %ld, msg: %@", (long)errorCode, info);
    }
}];
if (success) {
    [self.view makeToast:@"Avatar recording started successfully"];
}

If video recording fails to start (due to missing permissions or an error while creating the encoder), "false" is returned. Recording then runs on a sub-thread, and any error that occurs during recording is returned through the callback. For details about error codes, see Common Error Codes.

2. Stop video recording

To stop video recording, call the stopRecord API of ZegoAvatarView. The SDK saves the recorded content as a video file and writes it to the path specified by videoPath.

__weak typeof(self) weakSelf = self;
// Stop video recording. When you call the API, video recording will stop, and the recorded video file will be written into the path specified by videoPath defined in startRecord.
// An error that occurred during the process will be returned through the callback.
[_avatarView stopRecord:^{
    // To save videos to the album, refer to the code below.
    __strong typeof(self) strongSelf = weakSelf;
    if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(strongSelf.videoPath)) {
        // Core code for saving videos to the album
        // For devices with iOS earlier than 12, add this permission: Privacy - Photo Library Additions Usage Description
        // Note: video:didFinishSavingWithError:contextInfo: is a custom callback that handles both successful and failed saves of the video to the album; a sketch follows this code.
        UISaveVideoAtPathToSavedPhotosAlbum(strongSelf.videoPath, strongSelf, @selector(video:didFinishSavingWithError:contextInfo:), nil);
    }
}];
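
The save-completion selector referenced in the comment above uses the standard signature required by UISaveVideoAtPathToSavedPhotosAlbum. The following is a minimal sketch, assuming it is implemented in the same view controller; adjust the handling to fit your UI.

// Completion callback for UISaveVideoAtPathToSavedPhotosAlbum.
// "error" is nil when the video was saved to the album successfully.
- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error) {
        NSLog(@"Failed to save video to album: %@", error.localizedDescription);
    } else {
        NSLog(@"Video saved to album: %@", videoPath);
    }
}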

3. Customize audio collection

If recordAudio of the ZegoRecordConfig object is set to "YES", the SDK uses its built-in microphone capture module to collect audio data. Because the built-in module has limited device compatibility, ZEGOCLOUD recommends using the Express SDK to capture audio data instead. For details, see Custom Audio Capture and Rendering.

  1. Call the setCustomAudioDelegate API to register a custom audio capture delegate. (Inherit AudioDataDelegate and implement the onStart and onStop methods.)
  2. After audio data is captured, call the sendAudioData API to send the data to the SDK.
  • During video recording, if Audio Driver is enabled and custom audio capture is also enabled for Audio Driver, you only need to call setCustomAudioDelegate once; the SDK reuses the setting.
  • If you use the Express SDK to capture audio, refer to the ZegoAvatarExample/Express/ZegoExpressAudioCaptureDelegate.m file in the sample source code. It implements the Express SDK's custom audio pre-processing logic and provides the captured audio data to the Avatar SDK for video recording, as shown in the example and the onStart sketch that follow.
// ExpressAudioCaptureDelegate.h
@interface ExpressAudioCaptureDelegate : AudioDataDelegate<IAudioCapture>

@end

// ExpressAudioCaptureDelegate.m
@interface ExpressAudioCaptureDelegate()<ZegoEventHandler, ZegoCustomAudioProcessHandler>
{
    BOOL _isRunning;
}
@end

@implementation ExpressAudioCaptureDelegate

- (void)onStart{
    // Start audio collection.
    _isRunning = YES;
    // Configure Express and start custom audio pre-processing (see the sketch after this example).
}

- (void)onStop{
    // Stop audio collection.
    _isRunning = NO;
}

// This is the audio pre-processing callback of Express. Send the data captured by Express to the Avatar SDK.
- (void)onProcessCapturedAudioData:(unsigned char * _Nonnull)data dataLength:(unsigned int)dataLength param:(ZegoAudioFrameParam *)param timestamp:(double)timestamp {
    if (_isRunning) {
        // data: The raw PCM data.
        // dataLength: The data length.
        // dataType: The bit depth of the captured data. 0 indicates 16-bit, and 1 indicates 8-bit.
        // timeStamp: The timestamp from the start of capture to the current time.
        // sendAudioData is a parent-class method; the data is passed through to the Avatar SDK. Because the data provided by RTC here is 8-bit, dataType is set to 1.
        [super sendAudioData:(void *)data size:dataLength dataType:1 /* 8-bit data from RTC */ timeStamp:[super getDurationMs] /* parent-class method; can be called directly */];
    }
}
@end
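
The onStart step in the example above still needs to switch the Express SDK into custom audio pre-processing mode so that onProcessCapturedAudioData is invoked. The following is a minimal sketch of that wiring, assuming the Express engine has already been created; the sample rate and channel values are illustrative, and the complete implementation used by the demo is in ZegoAvatarExample/Express/ZegoExpressAudioCaptureDelegate.m.

// A possible body for onStart (sketch only).
- (void)onStart {
    // Start audio collection.
    _isRunning = YES;
    // Enable custom audio capture pre-processing so that onProcessCapturedAudioData is called back.
    ZegoCustomAudioProcessConfig *processConfig = [[ZegoCustomAudioProcessConfig alloc] init];
    processConfig.sampleRate = ZegoAudioSampleRate16K;
    processConfig.channel = ZegoAudioChannelMono;
    [[ZegoExpressEngine sharedEngine] enableCustomAudioCaptureProcessing:YES config:processConfig];
    // This class conforms to ZegoCustomAudioProcessHandler, so the captured frames are delivered here.
    [[ZegoExpressEngine sharedEngine] setCustomAudioProcessHandler:self];
}

Also remember to register this delegate with the Avatar SDK through setCustomAudioDelegate (step 1 above) so that sendAudioData can feed the captured frames into the video recording.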