Data Lake Compute

CreateTask

Last updated: 2025-11-13 20:53:24

1. API Description

Domain name for API request: dlc.intl.tencentcloudapi.com.

This API is used to create and execute a SQL task. (Using CreateTasks instead is recommended.)

A maximum of 50 requests can be initiated per second for this API.

We recommend using API Explorer. It provides a range of capabilities, including online calls, signature authentication, SDK code generation, and quick API search, and it lets you view the request, the response, and auto-generated examples.
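The 50 requests/second limit above is enforced server-side; a client that submits many CreateTask calls in a loop can throttle itself so requests are never rejected. A minimal token-bucket sketch, assuming client-side throttling is acceptable for your workload (the TokenBucket class is illustrative and not part of any SDK):

```python
import time

class TokenBucket:
    """Client-side throttle; the 50 req/s cap itself is enforced server-side."""

    def __init__(self, rate, capacity):
        self.rate = rate          # tokens replenished per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def acquire(self):
        """Consume one token if available; return False to signal 'wait and retry'."""
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

A caller would sleep briefly and try again whenever acquire() returns False.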

2. Input Parameters

The following request parameter list only provides API request parameters and some common parameters. For the complete common parameter list, see Common Request Parameters.

Parameter Name Required Type Description
Action Yes String Common Params. The value used for this API: CreateTask.
Version Yes String Common Params. The value used for this API: 2021-01-25.
Region Yes String Common Params. For more information, please see the list of regions supported by the product.
Task Yes Task Computing task. This parameter contains the task type and related configuration information.
DatabaseName No String Database name. If the SQL statement contains a database name, that database is used; otherwise, the database specified by this parameter is used. (Note: when submitting a CREATE DATABASE statement, pass an empty string for this field.)
DatasourceConnectionName No String Name of the default data source.
DataEngineName No String Data engine name. If this parameter is not specified, the task is submitted to the default engine.
ResourceGroupName No String Name of the resource group used when the task runs on a standard Spark engine.
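The SQL text inside Task must be Base64-encoded (the SQL field in the input example below is a Base64 string, and the error table lists SQLBase64DecodeFail for malformed encodings). A minimal sketch of assembling the request body; build_create_task_payload and its engine argument are illustrative helpers, and the choice of task key follows the InvalidTaskType error code (SparkSQLTask for Spark engines, SQLTask for Presto engines):

```python
import base64
import json

def build_create_task_payload(sql, database_name, engine="spark"):
    """Assemble a CreateTask request body; `engine` picks the task key (assumption)."""
    # The API expects the SQL statement Base64-encoded.
    encoded = base64.b64encode(sql.encode("utf-8")).decode("ascii")
    task_key = "SparkSQLTask" if engine == "spark" else "SQLTask"
    return json.dumps({
        "Task": {task_key: {"SQL": encoded, "Config": []}},
        "DatabaseName": database_name,
    })
```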

3. Output Parameters

Parameter Name Type Description
TaskId String Task ID
Note: This field may return null, indicating that no valid values can be obtained.
RequestId String The unique request ID generated by the server. It is returned for every request that reaches the server; if the request fails to reach the server, no RequestId is obtained. The RequestId is required for locating a problem.

4. Example

Example 1: Creating and executing a SQL task

This example shows you how to create and execute a SQL task.

Input Example

POST / HTTP/1.1
Host: dlc.intl.tencentcloudapi.com
Content-Type: application/json
X-TC-Action: CreateTask
<Common request parameters>

{
    "Task": {
        "SQLTask": {
            "SQL": "U0VMRUNUICogRlJPTSBgdGVzdGh5d2AuYHRlc3QxMDBtYCBMSU1JVCAxMDs=",
            "Config": [
                {
                    "Key": "",
                    "Value": ""
                }
            ]
        },
        "SparkSQLTask": {
            "SQL": "",
            "Config": [
                {
                    "Key": "",
                    "Value": ""
                }
            ]
        }
    },
    "DatabaseName": "testdb"
}

Output Example

{
    "Response": {
        "RequestId": "13bfd2b2-b92e-4c49-9c7e-3662b5f32165",
        "TaskId": "4ad30ca9-8b0e-499f-b4e1-d6e43ba0e564"
    }
}
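On success, TaskId and RequestId sit directly under Response, as in the output example above; on failure, TencentCloud API 3.0 instead places an Error object (with Code and Message) under Response. A sketch of unpacking both cases (parse_create_task_response is an illustrative name, not an SDK function):

```python
import json

def parse_create_task_response(body):
    """Return (TaskId, RequestId) from a CreateTask response, raising on an error envelope."""
    resp = json.loads(body)["Response"]
    if "Error" in resp:
        # Standard TencentCloud API 3.0 error envelope: Response.Error.{Code,Message}
        raise RuntimeError(f"{resp['Error']['Code']}: {resp['Error']['Message']}")
    return resp["TaskId"], resp["RequestId"]
```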

5. Developer Resources

SDK

TencentCloud API 3.0 integrates SDKs for a range of programming languages, making it easier to call APIs.

Command Line Interface

6. Error Code

The following only lists the error codes related to the API business logic. For other error codes, see Common Error Codes.

Error Code Description
FailedOperation.NoPermissionToUseTheDataEngine The user does not have permission to specify the engine.
FailedOperation.SQLTaskParseFailed Syntax parsing failed. Please verify and try again.
InternalError An internal error occurred.
InternalError.DBError A database error occurred.
InternalError.InternalSystemException The business system is abnormal. Please try again or submit a ticket to contact us.
InvalidParameter The parameter is incorrect.
InvalidParameter.DataEngineOnlySupportSQL The current task only supports SQL engine operation.
InvalidParameter.ImageEngineTypeNotMatch The specified engine type does not match. Currently, only SparkSQL, PrestoSQL, and SparkBatch are supported.
InvalidParameter.ImageIsPublicNotMatch The specified isPublic does not match. Currently, it only supports 1: public and 2: private.
InvalidParameter.ImageParameterNotFound The specified cluster image parameter does not exist.
InvalidParameter.ImageParameterSubmitMethodNotMatch The specified cluster image ParameterSubmitMethod does not match. Currently, only User and BackGround are supported.
InvalidParameter.ImageParameterTypeNotMatch The specified cluster image ParameterType does not match. Currently, it only supports 1: session; 2: common; 3: cluster.
InvalidParameter.InvalidConfigKeyNotFound The specified task parameter Key value does not exist.
InvalidParameter.InvalidConfigValueLengthOutLimit The length of the specified task parameter Value exceeds the limit.
InvalidParameter.InvalidConfigValueRegexpNotMatch The specified task parameter Value does not conform to the rules.
InvalidParameter.InvalidDataEngineName The data engine name is invalid.
InvalidParameter.InvalidFailureTolerance The task fault tolerance type is wrong. Currently, only Proceed/Terminate is supported.
InvalidParameter.InvalidSQL SQL parsing failed.
InvalidParameter.InvalidSQLConfigSQL Parameter verification failed. Please adjust the parameters or submit a ticket to contact us.
InvalidParameter.InvalidSQLNum The number of executed SQL statements is incorrect. The number of SQL statements must be greater than or equal to 1 and less than or equal to 50.
InvalidParameter.InvalidStoreLocation The storage location is incorrect.
InvalidParameter.InvalidTaskType The TaskType is incorrect. The Spark engine task type is SparkSQLTask, and the Presto engine task type is SQLTask.
InvalidParameter.InvalidWhiteListKey There is an error in getting an allowlist. Please try again or submit a ticket to contact us.
InvalidParameter.ParameterBase64DecodeFailed Base64 parsing of the specified parameter failed.
InvalidParameter.ParameterNotFoundOrBeNone The parameter is not found or empty.
InvalidParameter.SQLBase64DecodeFail Base64 parsing of the SQL script failed.
InvalidParameter.SQLParameterPreprocessingFailed SQL parameter preprocessing failed.
ResourceNotFound The resource does not exist.
ResourceNotFound.DataEngineNotFound The specified engine does not exist.
ResourceNotFound.DataEngineNotRunning The specified engine is not running.
ResourceNotFound.DataEngineNotUnique The specified engine already exists.
ResourceNotFound.DefaultDataEngineNotFound No default engine can be found.
ResourceNotFound.ResultOutputPathNotFound The result path was not found.
ResourceUnavailable.BalanceInsufficient The account balance is insufficient to run the SQL task.
UnauthorizedOperation.UseComputingEngine The sub-user does not have permission to use the compute engine.
UnsupportedOperation.UnsupportedFileType The current file format is not supported. Currently, it only supports json/csv/avro/orc/parquet.
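Which of the codes above are worth retrying client-side is a judgment call; the sketch below assumes the InternalError family is transient while parameter, permission, and resource errors are not, and ApiError and call_with_retry are illustrative names rather than SDK types:

```python
import time

# Assumption: InternalError-family codes are transient; InvalidParameter.*,
# FailedOperation.*, ResourceNotFound.*, etc. indicate request problems that
# a retry cannot fix.
RETRYABLE_CODES = {
    "InternalError",
    "InternalError.DBError",
    "InternalError.InternalSystemException",
}

class ApiError(Exception):
    """Carries the error code from Response.Error."""

    def __init__(self, code, message):
        super().__init__(f"{code}: {message}")
        self.code = code

def call_with_retry(call, max_attempts=3, backoff=1.0):
    """Invoke `call`, retrying retryable API errors with linear backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return call()
        except ApiError as err:
            if err.code not in RETRYABLE_CODES or attempt == max_attempts:
                raise
            time.sleep(backoff * attempt)
```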
