Tencent Cloud
Cloud Object Storage

Setting Log Analysis

Last updated: 2025-06-25 16:59:40

Overview

If COS log storage is enabled, you can use the log analysis feature to further analyze the generated log files. This feature consolidates log files within a specified time range for statistical analysis and extracts key metrics for your reference.

Prerequisites

The log storage feature is enabled. If it is not, follow the Set Log Storage guide to enable it.
Note:
The regions where log analysis is available depend on the log storage feature. Enable log storage before using the log analysis feature.

Directions

1. Log in to the COS console.
2. Choose Bucket List on the left sidebar.
3. Find the bucket that needs log analysis. Click the bucket name to go to the bucket management page.
4. On the left sidebar, select Log Management > Logging.
Note:
To use the log analysis feature, first enable log storage as instructed in Setting Logging.
5. Once log storage is enabled, the activation page for log analysis appears. Click Use Now, and the system checks whether you have added a COS log analysis function; if not, a pop-up guides you through creating one. For configuration instructions, see Add COS Log Analysis Function.
6. Select a corresponding function and time range. Click New Analysis Task and configure the following information in the pop-up window:
Source Bucket: The default is the current bucket.
Time Range: The period of logs to analyze, up to 30 days. Logs are retrieved by their end time, and the logs in the specified range cannot exceed 200 GB.
Cloud Function: Select a COS log analysis function added in the region where this bucket resides.
Product Destination Directory: After the analysis completes, the output is zipped and saved to the directory you set. The output contains a result file and an inventory file: the result file depends on the scenario you select, and the inventory file lists the log files retrieved for this analysis.
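Once downloaded, the zipped output described above can be inspected locally. Below is a minimal Python sketch using the standard zipfile module; the entry names ("result.csv", "inventory.csv") and their contents are placeholders for illustration, not the actual file names COS produces.

```python
import io
import zipfile

# List the entries inside a zip archive given its raw bytes.
def list_archive(data: bytes) -> list[str]:
    with zipfile.ZipFile(io.BytesIO(data)) as z:
        return z.namelist()

# Build a stand-in archive in memory to demonstrate; a real result
# archive would be downloaded from the destination directory instead.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("result.csv", "ip,bytes\n203.0.113.5,6000000\n")
    z.writestr("inventory.csv", "log_file\ncos-access-2025-06-24.gz\n")

print(list_archive(buf.getvalue()))
```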
7. Click Next to go to the detailed configuration page. The configuration items are described below.
Task Type: Supports statistical analysis and operation log search.
Statistical analysis: Performs statistical analysis for the supported object download scenarios.
Search operation logs: Searches logs related to specific feature operations.
Statistical Analysis
Scene categories: Currently, only object download scenarios are supported.
Select scene: The log analysis scenario to run, for example: the top N IPs and top N files by download traffic, or the top N IPs, top N files, top N HTTP referers, and top N UserAgents by download count.
Value of N: The N value used in the scenario. Enter a positive integer.
Job Description: A custom description for this analysis.
Search operation logs
Scene categories: Bucket-related and object-related scenarios are supported.
Select Scene:
Bucket-related: Includes scenarios related to static websites, origin-pull settings, lifecycle, tag management, inventory settings, cross-origin (CORS) configuration, hotlink protection, server-side encryption, bucket ACLs, policy permission settings, global acceleration domain names, versioning, cross-bucket replication, and log storage.
Operation Filtering: Supports filtering read and write operations.
Object-related: Supports file upload (single file upload, multipart upload, file copy), file download, single file deletion, and batch file deletion scenarios.
Filter Condition: Supports filtering by object key (not supported for batch file deletion), UserAgent, and source IP.
Filter Mode: Supports fuzzy matching and exact matching.
Job Description: A custom description for this analysis.
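As a rough illustration of the two filter modes, the sketch below models fuzzy matching as a substring test and exact matching as string equality. These semantics are an assumption chosen for illustration; the console defines the actual matching rules.

```python
# Apply a filter condition (e.g. on an object key, UserAgent, or
# source IP) in either exact or fuzzy mode.
def matches(value: str, pattern: str, mode: str = "exact") -> bool:
    if mode == "exact":
        return value == pattern          # whole-string equality
    if mode == "fuzzy":
        return pattern in value          # substring match (assumed)
    raise ValueError(f"unknown mode: {mode}")

keys = ["photos/2024/cat.jpg", "photos/2024/dog.jpg", "docs/readme.md"]
# Fuzzy match keeps every key containing the "photos/" prefix fragment.
print([k for k in keys if matches(k, "photos/", mode="fuzzy")])
```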
8. After confirming that the configuration is correct, click OK; you will see that the analysis task has been added. If the configuration is incorrect, follow the prompts to correct it.
Click View Result to view the result of this analysis task and where the result files are saved.
Note:
You can query log analysis tasks created in the last 3 days.
You can view the result only after a log analysis task finishes. A task takes anywhere from a few minutes to tens of minutes, depending on the log size.
Click Running Log to go to the SCF console and view the logs of the COS log analysis function.
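The top-N statistics computed by the statistical analysis scenarios (step 7) can be approximated locally along these lines. The record layout here (client IP, object key, bytes sent) is hypothetical, chosen only to illustrate the aggregation; real COS access logs have their own documented field layout.

```python
from collections import Counter

# Hypothetical parsed log records: (client_ip, object_key, bytes_sent).
records = [
    ("203.0.113.5", "a.jpg", 1_000_000),
    ("203.0.113.5", "b.mp4", 5_000_000),
    ("198.51.100.9", "a.jpg", 2_000_000),
]

# Sum download bytes per client IP and return the N largest totals,
# mirroring the "top N IPs by download traffic" scenario.
def top_n_ips_by_traffic(records, n: int):
    traffic = Counter()
    for ip, _key, nbytes in records:
        traffic[ip] += nbytes
    return traffic.most_common(n)

print(top_n_ips_by_traffic(records, 2))
```

The same Counter-based aggregation works for the other scenarios (top N files, referers, or UserAgents) by keying on a different field.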
