Tencent Cloud

Cloud Object Storage


Setting Document Moderation

Last updated: 2024-01-06 15:31:10

Overview

This document describes how to use the document moderation feature in the console to check file content for pornographic, illegal, and advertising information.
After you configure automatic document moderation, new documents uploaded to a bucket will be automatically moderated, and the identified non-compliant content can be automatically blocked (by denying public read access to the content).
You can also moderate existing documents stored in COS. For more information, see Document Moderation.
Note:
The document moderation feature leverages the document conversion capability to convert each page of a document into an image for moderation.
Document moderation is billed by Cloud Infinite (CI).
Currently, document types supported for moderation include:
Presentation files: PPTX, PPT, POT, POTX, PPS, PPSX, DPS, DPT, PPTM, POTM, PPSM.
Text files: DOC, DOT, WPS, WPT, DOCX, DOTX, DOCM, DOTM.
Spreadsheet files: XLS, XLT, ET, ETT, XLSX, XLTX, CSV, XLSB, XLSM, XLTM, ETS.
PDF.
Other files: TXT, LOG, HTM, HTML, LRC, C, CPP, H, ASM, S, JAVA, ASP, BAT, BAS, PRG, CMD, RTF, XML.
A spreadsheet file may be split into multiple pages, with multiple images generated.
The input file size cannot exceed 200 MB.
The number of pages in the input file cannot exceed 5,000.
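Before enabling automatic moderation, you can pre-validate candidate files against the documented limits. The sketch below is illustrative (the helper name is an assumption, and the 5,000-page limit is not checked, since page count is not known client-side):

```python
import os

# Supported suffixes, as listed in the note above.
PRESENTATION = {"pptx", "ppt", "pot", "potx", "pps", "ppsx", "dps", "dpt",
                "pptm", "potm", "ppsm"}
TEXT = {"doc", "dot", "wps", "wpt", "docx", "dotx", "docm", "dotm"}
SPREADSHEET = {"xls", "xlt", "et", "ett", "xlsx", "xltx", "csv", "xlsb",
               "xlsm", "xltm", "ets"}
OTHER = {"pdf", "txt", "log", "htm", "html", "lrc", "c", "cpp", "h", "asm",
         "s", "java", "asp", "bat", "bas", "prg", "cmd", "rtf", "xml"}
SUPPORTED = PRESENTATION | TEXT | SPREADSHEET | OTHER
MAX_SIZE = 200 * 1024 * 1024  # 200 MB input size limit

def can_moderate(filename: str, size_bytes: int) -> bool:
    """Return True if the file meets the documented format and size limits."""
    suffix = os.path.splitext(filename)[1].lstrip(".").lower()
    return suffix in SUPPORTED and 0 < size_bytes <= MAX_SIZE
```

For example, `can_moderate("report.docx", 1024)` passes, while an unsupported format such as MP4 or a file over 200 MB does not.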

Flowchart





Directions

1. Log in to the COS console.
2. On the Bucket List page, click the target bucket to enter the bucket details page.
3. On the left sidebar, select Sensitive Content Moderation > Automatic Moderation Configuration and click Document Moderation.
4. Click Add Automatic Document Moderation Configuration and set the following configuration items:
Moderation Scope: Select the scope of documents to be moderated, which can be the entire bucket, a specific directory, or a specific file prefix.
Moderation Suffix: Under Document Format, select one or more of the following: presentation, text, spreadsheet, and PDF.
Moderation Policy: Select a moderation policy. You can create different policies for fine-grained moderation; if none has been configured, the default policy is used. Moderation scene options include Pornographic, Illegal, and Advertisement, and you can select one or more of them. For more information on how to configure a moderation policy, see Setting Moderation Policy.
Moderation Scene: Displays the scenes configured in the selected moderation policy. Select the target scenes as needed.
File block configuration: Enable this option to authorize CI to block non-compliant files identified by automatic or human moderation by denying public read access to them. After enabling it, select the block type and the score range of files to be blocked.
Block mode: The following two block modes are supported:
Change the file ACL to private read: The file stays in place, but subsequent requests for it return a 403 status code, indicating that access is denied. For more information on file permissions, see ACL.
Transfer the file to the backup directory: The file is moved out of its original path, so subsequent requests for that path return a 404 status code, indicating that the file does not exist. The backup directory audit_freeze_backup/increment_audit is automatically created by the backend in the current bucket.
Block Type: Select a block mechanism. Machine moderation and block is selected by default. If you select Human moderation and block, the Tencent Cloud security team will additionally review suspiciously sensitive files flagged during machine moderation. You can also set the score range for blocking: an integer between 60 and 100, where a higher score indicates more sensitive content.
Callback: After callback is enabled, moderation results are pushed to you. Select the moderation type and callback content, and set the callback URL. For more information, see Document Moderation Callback Content.
5. After completing the configuration, click Save. Documents uploaded subsequently will be moderated.
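The blocking behavior configured above can be summarized in a small sketch. The helper names, mode strings, and threshold parameter below are illustrative assumptions, not part of the COS or CI API:

```python
def should_block(score: int, threshold: int = 90) -> bool:
    """Block files whose moderation score falls in [threshold, 100].

    The threshold must be an integer between 60 and 100, matching the
    configurable score range described above.
    """
    if not 60 <= threshold <= 100:
        raise ValueError("threshold must be an integer between 60 and 100")
    return score >= threshold

def status_after_block(mode: str) -> int:
    """HTTP status a client sees when requesting a blocked file's original key."""
    if mode == "private-acl":      # ACL changed to private read; file stays put
        return 403                 # access denied
    if mode == "move-to-backup":   # moved to audit_freeze_backup/increment_audit
        return 404                 # original key no longer exists
    raise ValueError(f"unknown block mode: {mode}")
```

With the default threshold of 90, a file scored 95 would be blocked and, under the ACL block mode, return 403 on subsequent access.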

Notes

1. Document moderation adopts a scoring mechanism, with a score between 0 and 100 returned for each output image.
2. Depending on the score range, the moderation result can be a sensitive image, suspiciously sensitive image, or normal image.
The score range of sensitive images is ≥ 91.
The score range of suspiciously sensitive images is 61–90. Such images cannot be accurately identified as sensitive, so human moderation is recommended to ensure their content security.
The score range of normal images is ≤ 60. Such images are determined as normal by the system.
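The scoring bands above map directly to a classification rule. A minimal sketch (the function name is illustrative, not part of any SDK):

```python
def classify(score: int) -> str:
    """Map a document moderation score (0-100) to the documented result bands."""
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score >= 91:
        return "sensitive"
    if score >= 61:
        return "suspiciously sensitive"  # human moderation recommended
    return "normal"
```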
