

| Configuration Item | Description |
| --- | --- |
| Task Name | Name of the data processing task, for example, my_transform. |
| Enabling Status | Starts or stops the task. Enabled by default. |
| Preprocessing Data | Toggle the switch on. There are two entries to the preprocessing feature:<br>Entry 1: toggle on the Preprocessing Data switch when creating a data processing task.<br>Entry 2: click Data Processing at the bottom of the Collection Configuration page to open the preprocessing data editing page. |
| Log Topic | Log topic that the preprocessing results are written to. |
| External Data Source | Adds an external data source, applicable to dimension table join scenarios. Currently, only TencentDB for MySQL is supported; see the res_rds_mysql function.<br>Region: region where the TencentDB for MySQL instance resides.<br>TencentDB for MySQL Instance: select an instance from the drop-down list.<br>Username: your database username.<br>Password: your database password.<br>Alias: alias of the MySQL source, used as a parameter in res_rds_mysql. |
| Data Processing Service Log | Run logs of the data processing task are saved in the cls_service_log log topic (free of charge). The alarm feature in the monitoring dashboard depends on this log topic and is enabled by default. |
| Upload Processing Failure Logs | If enabled, logs that fail to be processed are written to the target log topic. If disabled, processing-failed logs are discarded. |
| Field Name in Processing Failure Logs | If you choose to write failed logs to the target log topic, the raw failed logs are saved in this field, named ETLParseFailure by default. |
| Advanced Settings | Add environment variable: adds environment variables for the data processing task runtime. For example, if you add a variable named ENV_MYSQL_INTERVAL with the value 300, you can write refresh_interval=ENV_MYSQL_INTERVAL in the res_rds_mysql function, and the task resolves it to refresh_interval=300 at runtime (see the sketch after this table). |
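Below is a minimal sketch of how the MySQL alias and the environment variable above might be referenced in a processing statement. Only res_rds_mysql and the refresh_interval=ENV_MYSQL_INTERVAL substitution come from this documentation; the enrich_table call, the alias/sql parameter names, and the uid join key are illustrative assumptions, so check the res_rds_mysql function reference for the exact signature.

```
# Join a MySQL dimension table into the log stream.
# "my_mysql" is the Alias configured under External Data Source.
# ENV_MYSQL_INTERVAL is replaced with its value (300) at runtime.
# enrich_table, the sql parameter, and the "uid" join key are
# hypothetical placeholders; verify against the function reference.
enrich_table(
    res_rds_mysql(alias="my_mysql",
                  sql="SELECT uid, user_name FROM user_info",
                  refresh_interval=ENV_MYSQL_INTERVAL),
    "uid"
)
```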


| Configuration Item | Description |
| --- | --- |
| Task Name | Name of the data processing task, for example, my_transform. |
| Enabling Status | Starts or stops the task. Enabled by default. |
| Preprocessing Data | Toggle the switch off. |
| Source Log Topic | Data source of the data processing task. |
| External Data Source | Adds an external data source, applicable to dimension table join scenarios. Currently, only TencentDB for MySQL is supported; see the res_rds_mysql function.<br>Region: region where the TencentDB for MySQL instance resides.<br>TencentDB for MySQL Instance: select an instance from the drop-down list.<br>Username: your database username.<br>Password: your database password.<br>Alias: alias of the MySQL source, used as a parameter in res_rds_mysql. |
| Process Time Range | Time range of the logs to be processed. Note: only data within the log topic's lifecycle can be processed. |
| Target Log Topic | Select Fixed Log Topic.<br>Log Topic: destination log topic for the data processing output; one or more can be configured.<br>Target Name: name used to reference each target topic in processing statements. For example, to output loglevel=warning logs from the source log topic to Log Topic A, loglevel=error logs to Log Topic B, and loglevel=info logs to Log Topic C, set the target names of Log Topics A, B, and C to warning, error, and info (see the sketch after this table). |
| Data Processing Service Log | Run logs of the data processing task are saved in the cls_service_log log topic (free of charge). The alarm feature in the monitoring dashboard depends on this log topic and is enabled by default. |
| Upload Processing Failure Logs | If enabled, logs that fail to be processed are written to the target log topic. If disabled, processing-failed logs are discarded. |
| Field Name in Processing Failure Logs | If you choose to write failed logs to the target log topic, the raw failed logs are saved in this field, named ETLParseFailure by default. |
| Advanced Settings | Add environment variable: adds environment variables for the data processing task runtime. For example, if you add a variable named ENV_MYSQL_INTERVAL with the value 300, you can write refresh_interval=ENV_MYSQL_INTERVAL in the res_rds_mysql function, and the task resolves it to refresh_interval=300 at runtime. |
| Configuration Item | Description |
| --- | --- |
| Task Name | Name of the data processing task, for example, my_transform. |
| Enabling Status | Starts or stops the task. Enabled by default. |
| Preprocessing Data | Toggle the switch off. |
| Source Log Topic | Data source of the data processing task. |
| External Data Source | Adds an external data source, applicable to dimension table join scenarios. Currently, only TencentDB for MySQL is supported; see the res_rds_mysql function.<br>Region: region where the TencentDB for MySQL instance resides.<br>TencentDB for MySQL Instance: select an instance from the drop-down list.<br>Username: your database username.<br>Password: your database password.<br>Alias: alias of the MySQL source, used as a parameter in res_rds_mysql. |
| Process Time Range | Time range of the logs to be processed. Note: only data within the log topic's lifecycle can be processed. |
| Target Log Topic | Select Dynamic Log Topic. No target log topic needs to be configured; topics are generated automatically according to the value of the specified field. |
| Overrun Handling | When the number of topics generated by the data processing task exceeds the product specification, choose one of the following:<br>Create a fallback logset and log topic, and write logs to the fallback topic (created when the task is created). Fallback logset: auto_undertake_logset (one per account per region). Fallback topic: auto_undertake_topic_$(data processing task name). For example, if a user creates two data processing tasks, etl_A and etl_B, two fallback topics are created: auto_undertake_topic_etl_A and auto_undertake_topic_etl_B.<br>Discard log data: discard the logs directly without creating a fallback topic. |
| Data Processing Service Log | Run logs of the data processing task are saved in the cls_service_log log topic (free of charge). The alarm feature in the monitoring dashboard depends on this log topic and is enabled by default. |
| Upload Processing Failure Logs | If enabled, logs that fail to be processed are written to the target log topic. If disabled, processing-failed logs are discarded. |
| Field Name in Processing Failure Logs | If you choose to write failed logs to the target log topic, the raw failed logs are saved in this field, named ETLParseFailure by default. |
| Advanced Settings | Add environment variable: adds environment variables for the data processing task runtime. For example, if you add a variable named ENV_MYSQL_INTERVAL with the value 300, you can write refresh_interval=ENV_MYSQL_INTERVAL in the res_rds_mysql function, and the task resolves it to refresh_interval=300 at runtime. |
{"content": "[2021-11-24 11:11:08,232][328495eb-b562-478f-9d5d-3bf7e][INFO] curl -H 'Host: ' http://abc.com:8080/pc/api -d {\\"version\\": \\"1.0\\",\\"user\\": \\"CGW\\",\\"password\\": \\"123\\"}"}
| Dialogue Turn | User Question | AI Assistant Reply |
| --- | --- | --- |
| First-round dialogue | Structure this log. | |
| Second-round dialogue | The content is not standard JSON, and an error occurred when using ext_json. First extract the JSON part from the content, then extract the nodes from the JSON. | |
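A sketch of the two-step extraction described in the second-round question: first pull the JSON fragment (plus timestamp, request ID, and level) out of content with a regular expression, then expand the fragment with ext_json. Only ext_json is named in the dialogue above; the ext_regex and fields_drop names and signatures are assumptions modeled on the processing function reference.

```
# Step 1: extract time, request ID, level, and the trailing JSON body
# from the raw content field (ext_regex signature is an assumption).
ext_regex("content", regex="\[(.*?)\]\[(.*?)\]\[(.*?)\].*?(\{.*\})", output="time,requestid,level,json_body")
# Step 2: expand the extracted JSON into version, user, and password fields.
ext_json("json_body")
# Drop the intermediate and raw fields so only structured fields remain.
fields_drop("json_body")
fields_drop("content")
```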
{"level":"INFO","password":"123","requestid":"328495eb-b562-478f-9d5d-3bf7e","time":"2021-11-24 11:11:08,232","user":"CGW","version":"1.0"}



| Function Category | Visualization Function Name | Application Scenario |
| --- | --- | --- |
| Extract Key Value | JSON: extracts fields and field values from JSON nodes.<br>Separator: extracts field values based on a separator; you need to enter the field names.<br>Regular Expression: extracts field values using regular expressions; you need to enter the field names. | Log structuring |
| Log Processing | Filter Logs: configures conditions for filtering out logs (multiple conditions are in an OR relationship). For example, if field A exists or field B does not exist, the log is filtered out.<br>Distribute Logs: configures conditions for distributing logs. For example, if status="error" and message contains "404", distribute to Topic A; if status="running" and message contains "200", distribute to Topic B.<br>Retain Logs: configures conditions for retaining logs (see the sketch after this table). | Filter/distribute/retain logs |
| Field Processing | Delete Field<br>Rename Field | Delete/rename fields |
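For reference, a sketch of what the Log Processing and Field Processing entries above might produce as processing statements. The log_keep, regex_match, v, fields_rename, and fields_drop names are assumed from the processing function reference; adjust them to the exact statements your console generates.

```
# Retain only error logs whose message contains 404; other logs are dropped.
log_keep(regex_match(v("status"), regex="error"))
log_keep(regex_match(v("message"), regex="404"))
# Rename a field, then delete a sensitive one.
fields_rename("loglevel", "level")
fields_drop("password")
```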