API | Description |
version | The SDK version number |
getDeviceLevel | Get the device level. |
API | Description |
initWithRenderSize:assetsDict: | Initialization API. |
initWithGlTexture:width:height:flipY:assetsDict: | Initialization API. |
setEffect:effectValue:resourcePath:extraInfo: | Configures various beauty filter effects (added in 3.5.0.2). |
emitBlurStrengthEvent: | Sets the post-processing blur strength (applies to all blur components). |
setRenderSize: | Sets the render size. |
deinit | Releases resources. |
process: | Image data processing interface: takes the image before beautification and returns the image after beautification. |
process:withOrigin:withOrientation: | Image data processing interface: takes the image before beautification and returns the image after beautification. It has two additional parameters compared with the previous interface. |
exportCurrentTexture: | Exports the current image. After applying beauty effects with process: or process:withOrigin:withOrientation:, you can use this interface to obtain the current image. |
processUIImage:needReset: | Processes an image. |
getConfigPropertyWithName: | Gets effect information. |
registerLoggerListener:withDefaultLevel: | Registers a log listener. |
registerSDKEventListener: | Registers a listener for SDK events. |
clearListeners | Removes listeners. |
getCurrentGlContext | Gets the current OpenGL context. |
onPause | Pauses the SDK. |
onResume | Resumes the SDK. |
setAudioMute: | Sets whether to mute the audio of dynamic effect materials (new in V2.5.0). YES means mute; NO means not muted. |
setFeatureEnableDisable:enable: | Enables or disables a specific feature. |
setSyncMode:syncFrameCount: | Sets up synchronized video frame processing. |
 | To overlay an animation/beauty/segmentation material on the current material, set 'mergeWithCurrentMotion' to true in the 'withExtraInfo' dictionary when setting the material. |
EffectMode |
typedef NS_ENUM(NSInteger, DeviceLevel) {
    DEVICE_LEVEL_VERY_LOW = 1,
    DEVICE_LEVEL_LOW = 2,
    DEVICE_LEVEL_MIDDLE = 3,
    DEVICE_LEVEL_MIDDLE_HIGH = 4,
    DEVICE_LEVEL_HIGH = 5
};

+ (DeviceLevel)getDeviceLevel;
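A minimal usage sketch, assuming the class method above is exposed on the XMagic class used elsewhere in this document; low-end devices can then skip the more expensive effects:

// Query the device level once at startup and downgrade effects on weak devices.
DeviceLevel level = [XMagic getDeviceLevel];
if (level <= DEVICE_LEVEL_LOW) {
    // e.g. disable body retouch and 3D effects, keep only the basic beauty filters
}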
- (instancetype _Nonnull)initWithRenderSize:(CGSize)renderSize
                                 assetsDict:(NSDictionary * _Nullable)assetsDict;
Parameter | Description |
renderSize | The render size. |
assetsDict | The resource dictionary. |
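A minimal initialization sketch, assuming `previewSize` is the render size of your preview in pixels; the `assetsDict` keys are the ones used in the high-performance-mode example later in this document:

CGSize previewSize = CGSizeMake(720, 1280); // render size in pixels
NSDictionary *assetsDict = @{
    @"core_name": @"LightCore.bundle",
    @"root_path": [[NSBundle mainBundle] bundlePath]
};
self.xmagicApi = [[XMagic alloc] initWithRenderSize:previewSize assetsDict:assetsDict];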
- (instancetype _Nonnull)initWithGlTexture:(unsigned)textureID
                                     width:(int)width
                                    height:(int)height
                                     flipY:(bool)flipY
                                assetsDict:(NSDictionary * _Nullable)assetsDict;
Parameter | Description |
textureID | The texture ID. |
width | The texture width. |
height | The texture height. |
flipY | Whether to flip the image. |
assetsDict | The resource dictionary. |
- (void)setEffect:(NSString * _Nullable)effectName
      effectValue:(int)effectValue
     resourcePath:(NSString * _Nullable)resourcePath
        extraInfo:(NSDictionary * _Nullable)extraInfo;
Parameter | Meaning |
effectName | Effect type. |
effectValue | Effect value. |
resourcePath | Material path. |
extraInfo | Reserved for expansion and additional configuration. |
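A minimal call sketch, assuming `self.xmagicApi` is an initialized XMagic instance; the whitening key reuses the effect name from the configPropertyWithType examples later in this document, and effects that need a material would pass its path in resourcePath:

// Set skin brightening to strength 60; pure beauty effects need no resource path.
[self.xmagicApi setEffect:@"beauty.whiten" effectValue:60 resourcePath:nil extraInfo:nil];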
- (void)emitBlurStrengthEvent:(int)strength;
Parameter | Meaning |
strength | The blur strength. |
- (void)setRenderSize:(CGSize)size;
Parameter | Description |
size | The render size. |
- (void)deinit;
The input can be one of YTImagePixelData, YTTextureData, YTImageRawData, or YTUIImageData, and the output is returned in the corresponding data format. The pixel format in YTImagePixelData is RGBA, and the texture format in YTTextureData is OpenGL 2D.

/// @brief Process input: choose one of the four
@interface YTProcessInput : NSObject
/// Camera data object
@property (nonatomic, strong) YTImagePixelData * _Nullable pixelData;
/// Texture object
@property (nonatomic, strong) YTTextureData * _Nullable textureData;
/// Raw data object
@property (nonatomic, strong) YTImageRawData * _Nullable rawData;
/// UIImage object
@property (nonatomic, strong) YTUIImageData * _Nullable UIImageData;
/// Input data type
@property (nonatomic) enum YTProcessDataType dataType;
@end

/// @brief Process output
@interface YTProcessOutput : NSObject
/// Texture output object (always provided)
@property (nonatomic, strong) YTTextureData * _Nullable textureData;
/// Camera output object (if the input is camera acquisition data)
@property (nonatomic, strong) YTImagePixelData * _Nullable pixelData;
/// Raw output object (if the input is raw data)
@property (nonatomic, strong) YTImageRawData * _Nullable rawData;
/// UIImage output object (if the input is a UIImage object)
@property (nonatomic, strong) YTUIImageData * _Nullable UIImageData;
/// Output data type
@property (nonatomic) enum YTProcessDataType dataType;
@end

- (YTProcessOutput * _Nonnull)process:(YTProcessInput * _Nonnull)input;
Parameter | Meaning |
input | Input data processing information, one of four input formats can be chosen from (YTImagePixelData, YTTextureData, YTImageRawData, YTUIImageData). |
Type | Meaning |
YTImagePixelData | Camera data object; the pixel format is RGBA. |
YTTextureData | Texture object; the texture format is OpenGL 2D. |
YTImageRawData | Raw data object. |
YTUIImageData | UIImage object. |
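A per-frame sketch, assuming `cameraPixelData` is a YTImagePixelData built from the current camera frame; the YTProcessDataType constant name below is illustrative (the enum members are not listed in this document), so use the value that matches your input type:

YTProcessInput *input = [[YTProcessInput alloc] init];
input.pixelData = cameraPixelData;       // camera data object, RGBA
input.dataType = YTProcessDataTypePixel; // illustrative name; use the matching YTProcessDataType value
YTProcessOutput *output = [self.xmagicApi process:input];
// textureData is always populated; pixelData mirrors the camera input type.
YTImagePixelData *beautified = output.pixelData;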
- (YTProcessOutput* _Nonnull)process:(YTProcessInput* _Nonnull)input withOrigin:(YtLightImageOrigin)origin withOrientation:(YtLightDeviceCameraOrientation)orientation;
Parameter | Meaning |
input | Input data processing information. |
withOrigin | Enumeration value (YtLightImageOriginTopLeft or YtLightImageOriginBottomLeft). When set to YtLightImageOriginBottomLeft, the image is flipped vertically. |
withOrientation | Enumeration value for the image rotation angle. Setting the angle changes the angle of the output image. |
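Continuing the sketch above, assuming `orientation` holds the YtLightDeviceCameraOrientation value that matches the current camera rotation:

YTProcessOutput *output = [self.xmagicApi process:input
                                       withOrigin:YtLightImageOriginTopLeft
                                  withOrientation:orientation];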
/// Export a picture of the current image
- (void)exportCurrentTexture:(nullable void (^)(UIImage * _Nullable image))callback;
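A snapshot sketch, assuming at least one frame has already been processed so there is a current image to export:

[self.xmagicApi exportCurrentTexture:^(UIImage * _Nullable image) {
    if (image != nil) {
        // e.g. save the beautified snapshot to the photo album
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    }
}];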
/// @param context If you are using the OpenGL interface of this class, we recommend using this method to initialize. You can pass [xMagic getCurrentGlContext].
- (instancetype)initWithEAGLContext:(EAGLContext *)context;
Parameter | Meaning |
context | The OpenGL ES context. You can pass [xMagic getCurrentGlContext]. |
/// @brief CVPixelBufferRef YUV/RGB conversion interface. Currently only the three types in TEPixelFormatType are supported.
/// @param pixelBuffer Input pixelBuffer
/// @param outputFormat Format of the output pixelBuffer
- (CVPixelBufferRef)transformCVPixelBufferToBuffer:(CVPixelBufferRef)pixelBuffer outputFormat:(TEPixelFormatType)outputFormat;
Parameter | Meaning |
pixelBuffer | Input pixelBuffer data |
outputFormat | Output pixelBuffer format, supports BGRA, NV12F(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange), and NV12V(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange). |
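A conversion sketch, assuming `glUtils` is the instance of the helper class that exposes these methods and `nv12Buffer` is an NV12 camera frame; the TEPixelFormatType constant name below is illustrative, since the enum members are not spelled out in this document:

// Convert an NV12 pixel buffer to BGRA before further processing.
CVPixelBufferRef bgraBuffer = [glUtils transformCVPixelBufferToBuffer:nv12Buffer
                                                         outputFormat:TEPixelFormatTypeBGRA];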
/// Convert a YUV/RGB pixelBuffer to a BGRA texture ID
/// @param pixelBuffer Input pixelBuffer
- (GLuint)transformPixelBufferToBGRATexture:(CVPixelBufferRef)pixelBuffer;
Parameter | Meaning |
pixelBuffer | Input pixelBuffer data, supports BGRA, NV12F(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange), and NV12V(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange). |
/// Rotate and flip the CVPixelBufferRef. If you pass rotation and flipping at the same time, the processing logic is to mirror first and then rotate.
- (CVPixelBufferRef)convertCVPixelBuffer:(CVPixelBufferRef)pixelBuffer rotaion:(YtLightDeviceCameraOrientation)rotation flip:(TEFlipType)flipType;
Parameter | Meaning |
pixelBuffer | Input pixelBuffer data |
rotation | The counterclockwise rotation angle. Supported values: 0, 90, 180, and 270 degrees. |
flipType | The mirror type: horizontal or vertical mirroring. If both rotation and flipping are passed, the image is mirrored first and then rotated. |
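A rotate-and-flip sketch; `rotation90` and `flipHorizontal` stand in for the matching YtLightDeviceCameraOrientation and TEFlipType values, whose constant names are not listed in this document:

// Mirror horizontally, then rotate 90 degrees counterclockwise
// (mirroring is applied before rotation when both are passed).
CVPixelBufferRef converted = [glUtils convertCVPixelBuffer:pixelBuffer
                                                   rotaion:rotation90
                                                      flip:flipHorizontal];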
/// Rotate/flip the texture ID. If both rotation and flipping are passed, the processing logic is to mirror first and then rotate.
- (GLuint)convert:(GLuint)srcId width:(int)width height:(int)height rotaion:(YtLightDeviceCameraOrientation)rotation flip:(TEFlipType)flipType;
Parameter | Meaning |
srcId | Input Texture ID. |
width | Texture width. |
height | Texture height. |
rotation | The counterclockwise rotation angle. Supported values: 0, 90, 180, and 270 degrees. |
flipType | The mirror type: horizontal or vertical mirroring. If both rotation and flipping are passed, the image is mirrored first and then rotated. |
- (UIImage* _Nullable)processUIImage:(UIImage* _Nonnull)inputImage needReset:(bool)needReset;
Parameter | Description |
inputImage | The input image. If your image is larger than 2160 x 4096, we recommend you reduce its size before passing it in; otherwise, face recognition may fail or may be inaccurate. It may also cause an OOM error. |
needReset | This parameter must be set to true in any of the following cases: the image being processed changes; a keying effect is used for the first time; an animated effect is used for the first time; a makeup effect is used for the first time. |
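A single-image sketch, assuming the beauty effects have already been configured on `self.xmagicApi`; `portrait` is a placeholder asset name:

UIImage *inputImage = [UIImage imageNamed:@"portrait"]; // image to beautify
// Pass true because this image differs from the one processed previously.
UIImage *outputImage = [self.xmagicApi processUIImage:inputImage needReset:true];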
- (YTBeautyPropertyInfo * _Nullable)getConfigPropertyWithName:(NSString *_Nonnull)propertyName;
Parameter | Description |
propertyName | The effect name. |
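A lookup sketch; the effect name reuses the whitening key from the other examples in this document, and the fields of YTBeautyPropertyInfo are not listed here:

YTBeautyPropertyInfo *info = [self.xmagicApi getConfigPropertyWithName:@"beauty.whiten"];
if (info != nil) {
    // Inspect the returned effect information as needed.
}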
- (void)registerLoggerListener:(id<YTSDKLogListener> _Nullable)listener withDefaultLevel:(YtSDKLoggerLevel)level;
Parameter | Description |
listener | The log callback. |
level | The log output level, which is ERROR by default. |
- (void)registerSDKEventListener:(id<YTSDKEventListener> _Nullable)listener;
Parameter | Description |
listener | The listener for SDK events, including AI events, tips, and resource events. |
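A registration sketch, assuming `self` adopts both YTSDKEventListener and YTSDKLogListener (see the protocol sections below); the logger-level constant is illustrative, since the YtSDKLoggerLevel members are not listed in this document:

[self.xmagicApi registerSDKEventListener:self];
[self.xmagicApi registerLoggerListener:self withDefaultLevel:YT_SDK_ERROR_LEVEL]; // illustrative ERROR-level constant
// Call clearListeners before releasing the SDK instance.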
- (void)clearListeners;
- (nullable EAGLContext*)getCurrentGlContext;
/// @brief When your app is switched to the background, you need to call this API to pause the SDK
- (void)onPause;
/// @brief When your app is switched back to the foreground, you need to call this API to resume the SDK
- (void)onResume;
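One way to wire these two calls to the standard UIKit lifecycle notifications, assuming the SDK instance is owned by the same view controller:

// In viewDidLoad (or wherever the SDK instance is created):
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(appDidEnterBackground)
                                             name:UIApplicationDidEnterBackgroundNotification
                                           object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(appWillEnterForeground)
                                             name:UIApplicationWillEnterForegroundNotification
                                           object:nil];

- (void)appDidEnterBackground {
    [self.xmagicApi onPause];  // pause the SDK while in the background
}

- (void)appWillEnterForeground {
    [self.xmagicApi onResume]; // resume the SDK when back in the foreground
}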
/// @brief Set mute
- (void)setAudioMute:(BOOL)isMute;
/// @brief Set a feature on or off
/// @param featureName TEDefine FeatureName
/// @param enable on or off
- (void)setFeatureEnableDisable:(NSString *_Nonnull)featureName enable:(BOOL)enable;
Parameter | Description |
featureName | Name of the atomic capability. Valid values: XmagicConstant.FeatureName.SEGMENTATION_SKIN (skin segmentation; when enabled, makes the smoothing and whitening areas more accurate); XmagicConstant.FeatureName.SEGMENTATION_FACE_BLOCK (face occlusion detection; when enabled, prevents makeup from being applied to occluded areas); XmagicConstant.FeatureName.WHITEN_ONLY_SKIN_AREA (whitening applies only to skin); XmagicConstant.FeatureName.SMART_BEAUTY (intelligent beauty; reduces beauty and makeup effects for men and babies); XmagicConstant.FeatureName.ANIMOJI_52_EXPRESSION (facial expression capability); XmagicConstant.FeatureName.BODY_3D_POINT (body keypoint capability); XmagicConstant.FeatureName.HAND_DETECT (gesture detection capability). |
enable | true to enable the capability; false to disable it. |
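A toggle sketch; `SEGMENTATION_SKIN_FEATURE_NAME` is a placeholder for whichever TEDefine constant from the list above you need:

// Enable skin segmentation so smoothing and whitening are restricted to skin areas.
[self.xmagicApi setFeatureEnableDisable:SEGMENTATION_SKIN_FEATURE_NAME enable:YES];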
/// @brief Set synchronous processing of syncFrameCount frames to meet specific requirements in certain scenarios. For example, before processing the first frame, calling this interface lets the SDK process several frames synchronously, which can prevent unfiltered images from being displayed. However, this may increase the black screen duration before the first frame is rendered, so use it as needed.
- (void)setSyncMode:(BOOL)isSync syncFrameCount:(int)syncFrameCount;
Parameter | Meaning |
isSync | Whether to process image frames synchronously. |
syncFrameCount | The number of frames to process synchronously. The value must be >= 0, or -1 to indicate an unlimited number of frames. |
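A sketch of the typical first-frame use, called before the first frame is fed to the process: interface:

// Process the first 5 frames synchronously so no unfiltered frame is displayed.
[self.xmagicApi setSyncMode:YES syncFrameCount:5];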
API | Description |
YTSDKEventListener | The SDK event callback. |
YTSDKLogListener | The log callback. |
@protocol YTSDKEventListener <NSObject>
Return Type | Callback |
void | onAIEvent |
void | onTipsEvent |
void | onAssetEvent |
/// @param event Callback in dict format
- (void)onAIEvent:(id _Nonnull)event;
{
  "face_info": [{
    "trace_id": 5,
    "face_256_point": [180.0, 112.2, ...],
    "face_256_visible": [0.85, ...],
    "out_of_screen": true,
    "left_eye_high_vis_ratio": 1.0,
    "right_eye_high_vis_ratio": 1.0,
    "left_eyebrow_high_vis_ratio": 1.0,
    "right_eyebrow_high_vis_ratio": 1.0,
    "mouth_high_vis_ratio": 1.0
  }, ...]
}
Field | Type | Value Range | Remarks |
trace_id | int | [1,INF) | The face ID. If the faces obtained continuously from a video stream have the same face ID, they belong to the same person. |
face_256_point | float | [0,screenWidth] or [0,screenHeight] | 512 values in total for 256 facial keypoints. (0,0) is the top-left corner of the screen. |
face_256_visible | float | [0,1] | The visibility of the 256 facial keypoints. |
out_of_screen | bool | true/false | Whether only part of the face is captured. |
left_eye_high_vis_ratio | float | [0,1] | The percentage of keypoints with high visibility for the left eye. |
right_eye_high_vis_ratio | float | [0,1] | The percentage of keypoints with high visibility for the right eye. |
left_eyebrow_high_vis_ratio | float | [0,1] | The percentage of keypoints with high visibility for the left eyebrow. |
right_eyebrow_high_vis_ratio | float | [0,1] | The percentage of keypoints with high visibility for the right eyebrow. |
mouth_high_vis_ratio | float | [0,1] | The percentage of keypoints with high visibility for the mouth. |
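A parsing sketch for the face information above, assuming the event arrives as an NSDictionary with exactly the keys described in the table (verify the container type against your SDK version):

- (void)onAIEvent:(id _Nonnull)event {
    NSDictionary *dict = (NSDictionary *)event;
    NSArray *faces = dict[@"face_info"];
    for (NSDictionary *face in faces) {
        NSInteger traceId = [face[@"trace_id"] integerValue];
        BOOL outOfScreen = [face[@"out_of_screen"] boolValue];
        NSArray *points = face[@"face_256_point"]; // 512 values: x/y pairs for 256 keypoints
        NSLog(@"face %ld: %lu point values, out_of_screen=%d",
              (long)traceId, (unsigned long)points.count, outOfScreen);
    }
}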
/// @param event Callback in dict format
- (void)onAIEvent:(id _Nonnull)event;

/// @param event Callback in dict format
- (void)onTipsEvent:(id _Nonnull)event;

/// @param event Callback in string format
- (void)onAssetEvent:(id _Nonnull)event;
@protocol YTSDKLogListener <NSObject>
Return Type | API |
void | onLog |
/// @param loggerLevel The current log level.
/// @param logInfo The log information.
- (void)onLog:(YtSDKLoggerLevel)loggerLevel withInfo:(NSString * _Nonnull)logInfo;
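A minimal implementation sketch that forwards SDK logs to the console, assuming YtSDKLoggerLevel is an integer-backed enum:

- (void)onLog:(YtSDKLoggerLevel)loggerLevel withInfo:(NSString * _Nonnull)logInfo {
    // Replace NSLog with your own logging pipeline as needed.
    NSLog(@"[XMagic][level %ld] %@", (long)loggerLevel, logInfo);
}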
NSString *key = _xmagicUIProperty.property.Id;
NSString *value = [[NSBundle mainBundle] pathForResource:@"makeupMotionRes" ofType:@"bundle"];
NSDictionary *extraInfo = @{@"mergeWithCurrentMotion": @(true)};
[self.beautyKitRef configPropertyWithType:@"motion" withName:key withData:[NSString stringWithFormat:@"%@", value] withExtraInfo:extraInfo];
API | Description |
configPropertyWithType:withName:withData:withExtraInfo: | Configures various beauty effects. Deprecated; setEffect is recommended instead. |
isBeautyAuthorized: | Gets the authorization information of the effect parameter. |
- (int)configPropertyWithType:(NSString *_Nonnull)propertyType withName:(NSString *_Nonnull)propertyName withData:(NSString*_Nonnull)propertyValue withExtraInfo:(id _Nullable)extraInfo;
Parameter | Description |
propertyType | The effect type. |
propertyName | The effect name. |
propertyValue | The effect value. |
extraInfo | A reserved parameter, which can be used for dictionary configuration. |
NSString *propertyType = @"beauty"; // Set the effect type. Take beauty as an example.
NSString *propertyName = @"beauty.whiten"; // Specify the effect name. Take the skin brightening effect as an example.
NSString *propertyValue = @"60"; // Set the effect value.
[self.xmagicApi configPropertyWithType:propertyType withName:propertyName withData:propertyValue withExtraInfo:nil];
NSString *propertyType = @"lut"; // Set the effect type. Take filter as an example.
NSString *propertyName = [@"lut.bundle/" stringByAppendingPathComponent:@"xindong_lf.png"]; // Specify the effect name. Take the Allure filter as an example.
NSString *propertyValue = @"60"; // Set the effect value.
[self.xmagicApi configPropertyWithType:propertyType withName:propertyName withData:propertyValue withExtraInfo:nil];
NSString *propertyType = @"body"; // Set the effect type. Take body retouch as an example.
NSString *propertyName = @"body.legStretch"; // Specify the effect name. Take the long leg effect as an example.
NSString *propertyValue = @"60"; // Set the effect value.
[self.xmagicApi configPropertyWithType:propertyType withName:propertyName withData:propertyValue withExtraInfo:nil];
NSString *motion2dResPath = [[NSBundle mainBundle] pathForResource:@"2dMotionRes" ofType:@"bundle"]; // The absolute path of the `2dMotionRes` folder
NSString *propertyType = @"motion"; // Set the effect type. Take animated effect as an example.
NSString *propertyName = @"video_keaituya"; // Specify the effect name. Take the animated 2D cute effect as an example.
NSString *propertyValue = motion2dResPath; // Set the path of the animated effect.
[self.xmagicApi configPropertyWithType:propertyType withName:propertyName withData:propertyValue withExtraInfo:nil];
NSString *motionMakeupResPath = [[NSBundle mainBundle] pathForResource:@"makeupMotionRes" ofType:@"bundle"]; // The absolute path of the `makeupMotionRes` folder
NSString *propertyType = @"motion"; // Set the effect type. Take makeup as an example.
NSString *propertyName = @"video_nvtuanzhuang"; // Specify the effect name. Take the girl group makeup effect as an example.
NSString *propertyValue = motionMakeupResPath; // Set the path of the animated effect.
[self.xmagicApi configPropertyWithType:propertyType withName:propertyName withData:propertyValue withExtraInfo:nil];

// Below are the settings for the makeup strength (you only need to configure the above parameters once and can change the following settings multiple times).
NSString *propertyTypeMakeup = @"custom"; // Set the effect type. Take makeup as an example.
NSString *propertyNameMakeup = @"makeup.strength"; // Specify the effect name. Take the girl group makeup effect as an example.
NSString *propertyValueMakeup = @"60"; // Set the effect value.
[self.xmagicApi configPropertyWithType:propertyTypeMakeup withName:propertyNameMakeup withData:propertyValueMakeup withExtraInfo:nil];
NSString *motionSegResPath = [[NSBundle mainBundle] pathForResource:@"segmentMotionRes" ofType:@"bundle"]; // The absolute path of the `segmentMotionRes` folder
NSString *propertyType = @"motion"; // Set the effect type. Take keying as an example.
NSString *propertyName = @"video_segmentation_blur_75"; // Specify the effect name. Take background blurring (strong) as an example.
NSString *propertyValue = motionSegResPath; // Set the path of the animated effect.
NSDictionary *dic = @{@"bgName": @"BgSegmentation.bg.png", @"bgType": @0, @"timeOffset": @0, @"icon": @"segmentation.linjian.png"}; // Configure the reserved parameter.
[self.xmagicApi configPropertyWithType:propertyType withName:propertyName withData:propertyValue withExtraInfo:dic];
NSString *motionSegResPath = [[NSBundle mainBundle] pathForResource:@"segmentMotionRes" ofType:@"bundle"]; // The absolute path of the `segmentMotionRes` folder
NSString *propertyType = @"motion"; // Set the effect type. Take keying as an example.
NSString *propertyName = @"video_empty_segmentation"; // Specify the effect name. Take custom background as an example.
NSString *propertyValue = motionSegResPath; // Set the path of the animated effect.
NSString *imagePath = @"/var/mobile/Containers/Data/Application/06B00BBC-9060-450F-8D3A-F6028D185682/Documents/MediaFile/image.png"; // The absolute path of the background image or video (after compression).
int bgType = 0; // The background type. 0: image; 1: video.
int timeOffset = 0; // The duration. If an image is used as the background, its value is 0; if a video is used, its value is the video length.
NSDictionary *dic = @{@"bgName": imagePath, @"bgType": @(bgType), @"timeOffset": @(timeOffset), @"icon": @"segmentation.linjian.png"}; // Configure the reserved parameter.
[self.xmagicApi configPropertyWithType:propertyType withName:propertyName withData:propertyValue withExtraInfo:dic];
NSDictionary *assetsDict = @{
    @"core_name": @"LightCore.bundle",
    @"root_path": [[NSBundle mainBundle] bundlePath],
    @"setDowngradePerformance": @(YES) // Enable high performance mode.
};
self.beautyKit = [[XMagic alloc] initWithRenderSize:previewSize assetsDict:assetsDict];
/// @brief Enable the enhanced beautification mode
- (void)enableEnhancedMode;
Beauty Item | Recommended maximum value in enhanced mode (magnification factor) |
Whitening, shortening the face, V-face, eye distance, nose position, removal of laugh lines, lipstick, three-dimensional appearance | 1.3 |
Eye lightening | 1.5 |
Blush | 1.8 |
Other | 1.2 |
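A usage sketch, assuming the method is exposed on the initialized XMagic instance and is called before the per-item strengths are set:

[self.xmagicApi enableEnhancedMode]; // call once; then keep each item within the factors listed above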
/// @param featureId The effect parameter.
/// @return The authorization information of the effect parameter.
+ (BOOL)isBeautyAuthorized:(NSString * _Nullable)featureId;
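A check sketch, assuming the class method is exposed on the XMagic class; the key reuses the whitening effect name used elsewhere in this document:

// Check whether the current license authorizes the whitening effect before showing its UI entry.
BOOL authorized = [XMagic isBeautyAuthorized:@"beauty.whiten"];
if (!authorized) {
    // Hide or disable the corresponding control.
}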