| Item | Requirement |
| --- | --- |
| System | iOS 15.6 or later |
| Processor architecture | arm64 |
| Supported devices | Devices with an A12 or later (or M1 or later) chip: iPhone XS, iPhone XR, iPhone SE (2nd generation), iPhone 11 and newer; iPad 8, iPad mini 5, iPad Air 3, iPad Pro 3, new iPad Pro 1 and newer (roughly, iPhones released after 2018 and iPads released after 2019) |
| Development IDE | Xcode |
| Memory requirements | More than 500 MB |
The model resources are organized into two directories:
- common_model/: the base (common) model package. If there are multiple character models, they can share one base model package.
- human_xxxxx_540p_v3/: the avatar model package, where xxxxx is the name of the custom avatar.
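How these packages reach the device depends on your integration (bundled with the app or downloaded at runtime). As a minimal sketch, assuming the two folders are added to the app bundle as folder references, you could copy them into the Documents directory before initializing the SDK; the folder names below are placeholders:

```swift
import Foundation

/// Copies a bundled model folder into Documents if it is not already there.
/// Assumption: the folder was added to the app bundle as a folder reference.
func copyModelIfNeeded(named folderName: String) throws {
    let fileManager = FileManager.default
    let documents = fileManager.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let destination = documents.appendingPathComponent(folderName)
    guard !fileManager.fileExists(atPath: destination.path) else { return }
    guard let source = Bundle.main.url(forResource: folderName, withExtension: nil) else {
        print("\(folderName) is not bundled with the app")
        return
    }
    try fileManager.copyItem(at: source, to: destination)
}

// Usage: the avatar folder name is a placeholder for your custom avatar package
try? copyModelIfNeeded(named: "common_model")
try? copyModelIfNeeded(named: "human_xxxxx_540p_v3")
```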

```objectivec
#import <TencentVirtualHumanSDK/TencentVirtualHumanSDK.h>
```
Logging is configured through TVHLogManager; to turn logging off entirely, set the level to TVHLogLevelOff.

Objective-C:

```objectivec
[[TVHLogManager shareInstance] setLogLevel:TVHLogLevelInfo];
```

Swift:

```swift
TVHLogManager.shareInstance().setLogLevel(TVHLogLevelInfo)
```
| Log level | Severity flag |
| --- | --- |
| Disable logs | TVHLogLevelOff |
| Error logs | TVHLogLevelError |
| Warning logs | TVHLogLevelWarn |
| Info logs | TVHLogLevelInfo |
| Debug logs | TVHLogLevelDebug |
| Trace logs | TVHLogLevelTrace |
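As a small illustration, one way to pick a level per build configuration (this assumes your project defines the standard DEBUG compilation flag):

```swift
// Verbose logging in debug builds, no logging in release builds
#if DEBUG
TVHLogManager.shareInstance().setLogLevel(TVHLogLevelDebug)
#else
TVHLogManager.shareInstance().setLogLevel(TVHLogLevelOff)
#endif
```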
Objective-C:

```objectivec
int result = [[TVHLicenseManager shareInstance] authWithAppID:@"0000000000" andSecretKey:@"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"];
NSLog(@"license result: %d", result);
```

Swift:

```swift
let result = TVHLicenseManager.shareInstance().auth(withAppID: "0000000000", andSecretKey: "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx")
print("license result: \(result)")
```
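The meaning of the return value is defined by the SDK's error-code list, which is not reproduced here. The sketch below assumes that 0 means success; verify the actual success code against the SDK documentation before relying on it:

```swift
// Assumption: a return value of 0 indicates successful authorization
let result = TVHLicenseManager.shareInstance().auth(withAppID: "0000000000", andSecretKey: "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx")
if result != 0 {
    print("license auth failed, code: \(result)")
    // Do not create TVHController until authorization succeeds
}
```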
Objective-C:

```objectivec
// Place the common model and avatar model under the Documents directory in the sandbox first
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsPath = [paths firstObject];
NSString *common_model = [NSString stringWithFormat:@"%@/common_model", documentsPath];
NSString *human_model = [NSString stringWithFormat:@"%@/human_yunxi_540p_v3", documentsPath];
// Initialize the controller
self.controller = [[TVHController alloc] initWithCommonModelPath:common_model humanModelPath:human_model];
self.controller.delegate = self;
// Start rendering
[self.controller start];
```

Swift:

```swift
// Place the common model and avatar model under the Documents directory in the sandbox first
let paths = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)
let documentsPath = paths.first!
let commonModel = "\(documentsPath)/common_model"
let humanModel = "\(documentsPath)/human_youyou3_720p"
// Initialize the controller
controller = TVHController(commonModelPath: commonModel, humanModelPath: humanModel)
if controller != nil {
    controller.delegate = self
    // Start rendering
    controller.start()
}
```
The SDK renders its output to a TVHRenderView. You can set the size and position of this RenderView in the same way as an ordinary view, and add it as a subview to other views. Alternatively, you can implement TVHRenderViewDelegate to obtain the original RGBA data and draw it yourself; using TVHRenderView directly is recommended to get the best rendering effect.

Objective-C:

```objectivec
self.renderView = [[TVHRenderView alloc] initWithFrame:self.view.bounds];
self.renderView.fillMode = TVHMetalViewContentModeFit;
[self.view addSubview:self.renderView];
// Set renderView on the controller
self.controller.renderView = self.renderView;
```

Swift:

```swift
renderView = TVHRenderView(frame: self.view.bounds)
renderView.fillMode = .fit
view.addSubview(renderView)
// Set renderView on the controller
controller.renderView = renderView
```
| Fill mode | Description |
| --- | --- |
| TVHMetalViewContentModeStretch | Stretch the image |
| TVHMetalViewContentModeFit | Crop the image |
| TVHMetalViewContentModeFill | Keep black borders (transparent edges) |
Call appendAudioData multiple times to stream audio data. The input audio is saved to a buffer queue, and the duration of the audio passed in each call can be any value.

Objective-C:

```objectivec
// This code snippet demonstrates reading PCM data from a file, simulating streaming segmentation and sending it to the controller
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsPath = [paths firstObject];
NSString *pcmPath = [NSString stringWithFormat:@"%@/test.pcm", documentsPath];
// Read audio data
NSData *data = [NSData dataWithContentsOfFile:pcmPath];
// Simulate streaming: send the data in fragments (for demo only; the trailing remainder smaller than one package is dropped)
int packageSize = 1280;
int packageCount = (int)[data length] / packageSize;
for (int i = 0; i < packageCount; i++) {
    BOOL isFinal = (i == packageCount - 1);
    [self.controller appendAudioData:[data subdataWithRange:NSMakeRange(i * packageSize, packageSize)] metaData:@"" isFinal:isFinal];
}
```

Swift:

```swift
// This code snippet demonstrates reading PCM data from a file, simulating streaming segmentation and sending it to the controller
let paths = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)
guard let documentsPath = paths.first else { return }
let pcmPath = "\(documentsPath)/test.pcm"
// Read audio data
guard let data = try? Data(contentsOf: URL(fileURLWithPath: pcmPath)) else { return }
// Simulate streaming: send the data in fragments (for demo only; the trailing remainder smaller than one package is dropped)
let packageSize = 1280
let packageCount = data.count / packageSize
for i in 0..<packageCount {
    let range = i * packageSize..<(i * packageSize + packageSize)
    let subData = data.subdata(in: range)
    let isFinal = (i == packageCount - 1)
    controller.appendAudioData(subData, metaData: "", isFinal: isFinal)
}
```
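The loop above intentionally drops the tail of the file that does not fill a whole package. If you want to broadcast the entire file, pass isFinal: false for every full package and flush the remainder as the final fragment; a minimal Swift sketch using the same variables:

```swift
// Flush the remaining bytes (less than one package) as the final fragment
let remainder = data.count % packageSize
if remainder > 0 {
    let tail = data.subdata(in: (data.count - remainder)..<data.count)
    controller.appendAudioData(tail, metaData: "", isFinal: true)
}
```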
To interrupt the current broadcast, call the interrupt method of the controller. The digital human immediately stops speaking and stops audio playback.

Objective-C:

```objectivec
[self.controller interrupt];
```

Swift:

```swift
controller.interrupt()
```
The controller sends the speakFinish message when the current broadcast ends. Before the speakFinish message is received, calls to appendAudioData will also wait until speakFinish before the broadcast resumes. Use the pause and resume methods to suspend and resume the digital human; for example, you can use NSNotificationCenter to register listeners and get notified when the app enters the background or foreground.

Objective-C:

```objectivec
@implementation DemoViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // ... Other code
    // Register notifications
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(appDidEnterBackground:)
                                                 name:UIApplicationDidEnterBackgroundNotification
                                               object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(appWillEnterForeground:)
                                                 name:UIApplicationWillEnterForegroundNotification
                                               object:nil];
}

- (void)dealloc {
    // Remove the observers when the controller is destroyed
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

- (void)appDidEnterBackground:(NSNotification *)notification {
    // Handle suspension
    [self.controller pause];
}

- (void)appWillEnterForeground:(NSNotification *)notification {
    // Handle resumption
    [self.controller resume];
}

@end
```

Swift:

```swift
class DemoViewController2: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // ... Other code
        // Register notifications
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(appDidEnterBackground(_:)),
                                               name: UIApplication.didEnterBackgroundNotification,
                                               object: nil)
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(appWillEnterForeground(_:)),
                                               name: UIApplication.willEnterForegroundNotification,
                                               object: nil)
    }

    deinit {
        // Remove the observers when the controller is destroyed
        NotificationCenter.default.removeObserver(self)
    }

    @objc private func appDidEnterBackground(_ notification: Notification) {
        // Handle suspension
        controller.pause()
    }

    @objc private func appWillEnterForeground(_ notification: Notification) {
        // Handle resumption
        controller.resume()
    }
}
```
Implement the TVHControllerDelegate protocol and register the delegate object with the controller to receive notification messages.

Objective-C:

```objectivec
// Implement TVHControllerDelegate
@interface DemoViewController () <TVHControllerDelegate>
// ... Other code
@end

@implementation DemoViewController

// Rendering started
- (void)renderStart {
    NSLog(@"render start");
}

// Broadcast started
- (void)speakStart {
    NSLog(@"speak start");
}

// Broadcast finished
- (void)speakFinish {
    NSLog(@"speak finish");
}

// Broadcast error
- (void)speakError:(NSInteger)errorCode message:(NSString *)message {
    NSLog(@"speak error code:%ld, message:%@", (long)errorCode, message);
}

@end
```

Swift:

```swift
// Implement TVHControllerDelegate
extension DemoViewController2: TVHControllerDelegate {
    // Rendering started
    nonisolated func renderStart() {
        print("render start")
    }

    // Broadcast started
    nonisolated func speakStart() {
        print("speak start")
    }

    // Broadcast finished
    nonisolated func speakFinish() {
        print("speak finish")
    }

    // Broadcast error
    nonisolated func speakError(_ errorCode: Int, message: String!) {
        print("speak error code:\(errorCode), message:\(message)")
    }
}
```
To attach additional information to an audio segment, pass it in the metaData parameter.

Objective-C:

```objectivec
[self.controller appendAudioData:data metaData:@"Here is additional information" isFinal:isFinal];
```

Swift:

```swift
controller.appendAudioData(data, metaData: "Here is additional information", isFinal: isFinal)
```
Objective-C:

```objectivec
// Implement TVHControllerDelegate
@interface DemoViewController () <TVHControllerDelegate>
// ... Other code
@end

@implementation DemoViewController

// Meta information broadcast started
- (void)speakMetaStart:(NSString *)metaData {
    NSLog(@"speak meta start: %@", metaData);
}

// Meta information broadcast finished
- (void)speakMetaFinish:(NSString *)metaData {
    NSLog(@"speak meta finish: %@", metaData);
}

@end
```

Swift:

```swift
// Implement TVHControllerDelegate
extension DemoViewController2: TVHControllerDelegate {
    // Meta information broadcast started
    nonisolated func speakMetaStart(_ metaData: String!) {
        print("speak meta start: \(metaData)")
    }

    // Meta information broadcast finished
    nonisolated func speakMetaFinish(_ metaData: String!) {
        print("speak meta finish: \(metaData)")
    }
}
```
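metaData is passed through as a plain string, so one possible pattern (an application-level convention, not something the SDK requires) is to encode structured information as JSON when appending audio and parse it back in speakMetaStart/speakMetaFinish. A minimal Swift sketch, where data and isFinal come from the streaming loop shown earlier:

```swift
import Foundation

// Encode structured metadata as a JSON string (the keys here are arbitrary examples)
let meta: [String: Any] = ["segmentId": 3, "text": "Hello"]
let metaJSON: String
if let jsonData = try? JSONSerialization.data(withJSONObject: meta),
   let jsonString = String(data: jsonData, encoding: .utf8) {
    metaJSON = jsonString
} else {
    metaJSON = ""
}

controller.appendAudioData(data, metaData: metaJSON, isFinal: isFinal)
```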
Call the canSmoothlyRun method to check whether the current device can smoothly run the rendering SDK.

Objective-C:

```objectivec
BOOL canRun = [[TVHDeviceManager shareInstance] canSmoothlyRun];
```

Swift:

```swift
let canRun = TVHDeviceManager.shareInstance().canSmoothlyRun()
```
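As a sketch of how this check might be combined with the initialization shown earlier (the fallback behavior is application-specific and only an example):

```swift
// Only create and start the digital human when the device can run it smoothly
if TVHDeviceManager.shareInstance().canSmoothlyRun() {
    controller = TVHController(commonModelPath: commonModel, humanModelPath: humanModel)
    if controller != nil {
        controller.delegate = self
        controller.renderView = renderView
        controller.start()
    }
} else {
    // Fall back to an audio-only or static experience in your app
    print("This device cannot smoothly run the digital human rendering SDK")
}
```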



