Image Moderation
Last updated: 2025-12-11 21:24:53

Feature Overview

The Image Moderation feature uses image analysis algorithms trained on massive volumes of non-compliant image data to help you identify prohibited content in images. It filters content such as pornography, illegal activity, and advertisements with high recognition accuracy and recall, covering content moderation requirements across multiple dimensions. Detection standards and capabilities are updated continuously to keep pace with regulatory requirements and keep your platform's images secure.
Note:
Image Moderation is a paid service. For pricing, see Content Moderation fees. After your account first incurs usage for this service, Cloud Infinite issues a free resource pack of 100,000 moderation calls that is valid for 2 months; usage beyond the pack, or after it expires, is billed normally.
Before using Image Moderation, confirm the applicable limits and supported regions. For details, see Usage Limitations.
After the Image Moderation service is enabled, newly generated images in your COS Bucket are moderated automatically, and detected non-compliant content can be automatically frozen (public read access is blocked).

Feature Experience

You can try the Image Moderation feature online in the Cloud Infinite Experience Center.

Applicable Scenarios

E-commerce Platform

E-commerce applications carry large volumes of UGC, and the rapid growth of live-streaming shopping makes them especially prone to non-compliant content such as QR codes and pyramid-scheme ads. Image Moderation covers a wide range of violation types and filters the many forms of advertising images found in e-commerce scenarios with millisecond-level response times, keeping the browsing experience seamless.

Social Platform

Social platforms are dominated by user-uploaded UGC of diverse types. Image Moderation covers a wide range of violation types; once configured in the console, it automatically moderates newly added content with millisecond-level response times, keeping the browsing experience seamless.

Online Education

Online education platforms mainly serve children and adolescents and face stricter regulatory requirements for content compliance, which calls for customized moderation scenarios and broader detection categories. Image Moderation delivers high machine recognition accuracy, safeguarding educational content.

Gaming Platform

Gaming applications involve many scenarios with user-uploaded custom avatars and UGC images in gaming communities. Image Moderation provides a one-stop solution for the multi-scenario moderation needs of gaming clients. With scenario-based customization, it is trained on extensive industry-specific data and uses specialized models built for the gaming industry, delivering higher machine recognition accuracy for avatar review, anime content moderation, and more.

Prerequisites

You have enabled the COS service, created a Bucket, and uploaded files to the Bucket. For specific operations, see Bucket files.
You have activated the Cloud Infinite service and bound the Bucket. For specific operations, see Bucket Binding.

Usage

Automated Moderation on Upload

You can create automatic auditing rules for Buckets in the Cloud Infinite console. After creation, newly uploaded images in the Bucket will be audited during upload. For usage details, see the Image Moderation console documentation.

Moderation via API Call

You can call the Image Auditing API to audit local images, third-party image URLs, and images in COS. For usage details, see the Image Moderation API documentation.
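For orientation, a minimal sketch of a synchronous moderation request for a COS-stored image is shown below. It assumes the ci-process=sensitive-content-recognition query interface and a detect-type parameter as described in the Image Moderation API documentation; the bucket host, object key, and detect-type values are placeholders, and a private object would additionally require a COS authorization signature, so verify the exact parameters and signing rules in the API documentation.

import requests

# Placeholders: replace with your own bucket host and object key.
BUCKET_HOST = "examplebucket-1250000000.cos.ap-guangzhou.myqcloud.com"
OBJECT_KEY = "uploads/avatar.jpg"

def moderate_image() -> str:
    """Request synchronous moderation of one COS-stored image.

    Assumes the object is publicly readable; for a private bucket the
    request must also carry a COS authorization signature.
    """
    params = {
        # Assumed parameters: ci-process selects the moderation interface,
        # detect-type the moderation scenes to run (e.g. porn, ads).
        "ci-process": "sensitive-content-recognition",
        "detect-type": "porn,ads",
    }
    resp = requests.get(f"https://{BUCKET_HOST}/{OBJECT_KEY}", params=params, timeout=10)
    resp.raise_for_status()
    return resp.text  # XML body with per-scene scores and labels

if __name__ == "__main__":
    print(moderate_image())

The same interface also accepts third-party image URLs through a separate parameter; see the API documentation for the exact parameter name and for moderating local images.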

Historical Data Moderation

For large batches of image data stored in COS, you can create historical data auditing tasks to perform one-time auditing on bulk images. For usage details, see Setting Historical Data Moderation Tasks.

Using the API

You can use our provided API to perform Content Moderation on images. For details, see Image Moderation API documentation.

Using SDKs

You can use our SDKs in various languages to perform Content Moderation on images. For details, see the following SDK documentation:
Android SDK
C++ SDK
Go SDK
.NET SDK
Java SDK
JavaScript SDK
Python SDK
Mini Program SDK
Note:
The processing capabilities provided by Cloud Infinite are fully integrated with the COS SDK. You can directly utilize the COS SDK for processing operations.
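As an illustration of the COS SDK integration, the sketch below uses the Python COS SDK (cos-python-sdk-v5) and assumes it exposes a get_object_sensitive_content_recognition interface for synchronous image moderation; the method name, the CiDetectType constants, and the credentials, bucket, and key values are assumptions or placeholders, so check the Python SDK documentation for the exact interface.

from qcloud_cos import CosConfig, CosS3Client
from qcloud_cos.cos_comm import CiDetectType

# Placeholders: fill in your own credentials, region, and bucket.
config = CosConfig(
    Region="ap-guangzhou",
    SecretId="YOUR_SECRET_ID",
    SecretKey="YOUR_SECRET_KEY",
)
client = CosS3Client(config)

# Assumed SDK interface for synchronous moderation of a COS object;
# DetectType is a bitmask selecting the moderation scenes to run.
response = client.get_object_sensitive_content_recognition(
    Bucket="examplebucket-1250000000",
    Key="uploads/avatar.jpg",
    DetectType=CiDetectType.PORN | CiDetectType.ADS,
)
print(response)  # Parsed moderation result with per-scene scores and labels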

View Audit Results

Callback settings: You can configure the callback address, the moderation types to call back, callback thresholds, and other options to filter the callback content. Moderation results are then pushed to your callback address automatically for follow-up processing; a minimal receiver sketch follows this list. For details on the callback content, see the Image Moderation console documentation.
Visual processing: After enabling Image Moderation, you can filter and view moderation results on the moderation details page in the console and process them manually. For usage details, see the Moderation Details console documentation.
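To illustrate the callback flow, the sketch below receives moderation callbacks with a small Flask service. The payload field names used here (data, result, url) are assumptions for illustration only; take the actual structure from the callback content description in the Image Moderation console documentation, and validate the request origin before acting on it in production.

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/ci/moderation-callback", methods=["POST"])
def moderation_callback():
    # The field names below (data, result, url) are illustrative assumptions;
    # use the structure documented for the callback content.
    payload = request.get_json(force=True, silent=True) or {}
    data = payload.get("data", {})
    if data.get("result") == 1:  # assumed convention: 1 = non-compliant
        print("Non-compliant image reported:", data.get("url"))
        # Follow-up actions go here, e.g. freeze the object or queue manual review.
    # Acknowledge receipt so the callback is not retried.
    return jsonify({"code": 0})

if __name__ == "__main__":
    app.run(port=8080)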