Tencent Cloud

Cloud Log Service

Data Processing

Last updated: 2024-12-17 17:55:46
The limits for data processing are listed in the tables below.

**Data Processing**

| Restriction Item | Description |
| --- | --- |
| Number of data processing tasks | A single log topic supports up to 50 data processing tasks. |
| Number of target topics | A single data processing task supports up to 1,000 target topics. |
| Number of debugging data entries | By default, 100 log entries are loaded from the source topic for debugging the DSL function. |
| Processing capability | 10 to 20 MB/s per partition (uncompressed data), where "partition" refers to a partition of the source log topic. |
| Processing task latency | Up to 99.9% of the data is processed by the data processing module within 1 second. |
| Impact of starting and stopping tasks | If a suspended processing task is restarted within 12 hours, processing resumes from the data position at the time of suspension. If it is restarted more than 12 hours after suspension, processing starts from the latest data. |
| Cross-region/cross-account processing | Not supported. |
| Processing historical data | Not supported. |
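The 12-hour restart rule above can be sketched as a small decision helper. This is an illustration only (the function name and return values are hypothetical, not part of the CLS API): a task restarted within 12 hours of suspension resumes from its suspension checkpoint; otherwise it starts from the latest data.

```python
from datetime import datetime, timedelta

# Documented behavior: a checkpoint is honored for at most 12 hours
# after a processing task is suspended.
RESUME_WINDOW = timedelta(hours=12)

def resume_position(suspended_at: datetime, restarted_at: datetime) -> str:
    """Return where a restarted processing task begins consuming.

    Hypothetical helper for illustration; not a CLS SDK function.
    """
    if restarted_at - suspended_at <= RESUME_WINDOW:
        return "checkpoint"  # continue from the data at suspension time
    return "latest"          # checkpoint expired; start from newest data

# Restarted 3 hours after suspension: resumes from the checkpoint.
print(resume_position(datetime(2024, 12, 17, 9, 0),
                      datetime(2024, 12, 17, 12, 0)))
# Restarted 13 hours after suspension: starts from the latest data.
print(resume_position(datetime(2024, 12, 17, 9, 0),
                      datetime(2024, 12, 17, 22, 0)))
```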
**Scheduled SQL Analysis**

| Restriction Item | Description |
| --- | --- |
| Number of scheduled SQL tasks | A single log topic can be associated with up to 10 scheduled SQL tasks (as either the source log topic or the target log topic of a task). |
| Query concurrency | A single log topic supports up to 15 concurrent queries (including scheduled SQL, search and analysis, alarms, and dashboards). |
| Viewing query results | For statistical analysis results, 100 results are returned at a time by default; with the SQL LIMIT syntax, up to 1 million results can be returned at a time. The returned data packet of query results is at most 49 MB; gzip compression can be enabled when the API is used. |
| Query timeout | A single query times out after 55 seconds. |
| SQL time window | 1 minute to 7 days. |
| Scheduling cycle | 1 minute to 24 hours. |
| Custom timestamp | The minimum custom timestamp interval is 15 seconds, for example, 15:00:00, 15:00:15, and 15:00:30. |
| SQL execution delay | The default value is 60 seconds and the maximum value is 120 seconds. About 99.9% of log indexes are generated within 5 seconds, but generation can take up to 60 seconds in extreme cases. Set an execution delay to ensure that queries can obtain complete index data. |
| Cross-region/cross-account analysis | Not supported. |
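As a minimal sketch of the 15-second custom timestamp rule above, the following snippet floors an arbitrary timestamp to the nearest 15-second boundary (the function is an illustrative helper, not part of the CLS API):

```python
from datetime import datetime

# Documented minimum custom timestamp interval for scheduled SQL tasks.
STEP_SECONDS = 15

def align_to_interval(ts: datetime, step: int = STEP_SECONDS) -> datetime:
    """Floor a timestamp to a multiple of `step` seconds.

    Hypothetical helper for illustration; not a CLS SDK function.
    """
    epoch = ts.timestamp()
    return datetime.fromtimestamp(epoch - epoch % step)

# 15:00:07 is not on a 15-second boundary; it floors to 15:00:00.
print(align_to_interval(datetime(2024, 12, 17, 15, 0, 7)))
# 15:00:30 is already aligned and is returned unchanged.
print(align_to_interval(datetime(2024, 12, 17, 15, 0, 30)))
```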

