Column information of the data table.
Used by actions: DescribeTaskResult.
Name | Type | Required | Description |
---|---|---|---|
Name | String | Yes | Column name, which is case-insensitive and can contain up to 25 characters. |
Type | String | Yes | Column type. Valid values: string|tinyint|smallint|int|bigint|boolean|float|double|decimal|timestamp|date|binary|array |
Comment | String | No | Column comment. Note: This field may return null, indicating that no valid values can be obtained. |
Precision | Integer | No | Length of the entire numeric value. Note: This field may return null, indicating that no valid values can be obtained. |
Scale | Integer | No | Length of the decimal part. Note: This field may return null, indicating that no valid values can be obtained. |
Nullable | String | No | Whether the column is nullable. Note: This field may return null, indicating that no valid values can be obtained. |
Position | Integer | No | Field position. Note: This field may return null, indicating that no valid values can be obtained. |
CreateTime | String | No | Field creation time. Note: This field may return null, indicating that no valid values can be obtained. |
ModifiedTime | String | No | Field modification time. Note: This field may return null, indicating that no valid values can be obtained. |
IsPartition | Boolean | No | Whether the column is the partition field. Note: This field may return null, indicating that no valid values can be obtained. |
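As an illustration of the table above, the sketch below builds a hypothetical Column payload as it might appear in a DescribeTaskResult response and checks two of the stated constraints (the `Type` enumeration and the 25-character name limit). The concrete field values are made up; this is not official SDK code.

```python
# Illustrative Column payload using only fields from the table above.
# Concrete values ("user_id", "bigint", ...) are invented for this example.
column = {
    "Name": "user_id",         # required; case-insensitive, up to 25 characters
    "Type": "bigint",          # required; must be one of the listed types
    "Comment": "primary key",  # optional; may be null in responses
    "Precision": None,         # optional; length of the entire numeric value
    "Scale": None,             # optional; length of the decimal part
    "Nullable": "false",       # optional; note: a String in this API, not a Boolean
    "Position": 0,             # optional; field position
    "IsPartition": False,      # optional; whether this is a partition field
}

VALID_TYPES = {"string", "tinyint", "smallint", "int", "bigint", "boolean",
               "float", "double", "decimal", "timestamp", "date", "binary", "array"}

assert column["Type"] in VALID_TYPES
assert len(column["Name"]) <= 25
```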
Common task metrics
Used by actions: DescribeSparkAppTasks, DescribeTasks.
Name | Type | Description |
---|---|---|
CreateTaskTime | Float | The task creation time in ms. Note: This field may return null, indicating that no valid values can be obtained. |
ProcessTime | Float | The processing time in ms. Note: This field may return null, indicating that no valid values can be obtained. |
QueueTime | Float | The queue time in ms. Note: This field may return null, indicating that no valid values can be obtained. |
ExecutionTime | Float | The execution duration in ms. Note: This field may return null, indicating that no valid values can be obtained. |
IsResultCacheHit | Boolean | Whether the result cache is hit. Note: This field may return null, indicating that no valid values can be obtained. |
MatchedMVBytes | Integer | The volume of matched materialized views, in bytes. Note: This field may return null, indicating that no valid values can be obtained. |
MatchedMVs | String | The list of matched materialized views. Note: This field may return null, indicating that no valid values can be obtained. |
AffectedBytes | String | The result data in bytes. Note: This field may return null, indicating that no valid values can be obtained. |
AffectedRows | Integer | The number of rows in the result. Note: This field may return null, indicating that no valid values can be obtained. |
ProcessedBytes | Integer | The volume of the data scanned, in bytes. Note: This field may return null, indicating that no valid values can be obtained. |
ProcessedRows | Integer | The number of scanned rows. Note: This field may return null, indicating that no valid values can be obtained. |
COS permissions
Used by actions: DescribeUserRoles.
Name | Type | Required | Description |
---|---|---|---|
CosPath | String | No | The COS path. Note: This field may return null, indicating that no valid values can be obtained. |
Permissions | Array of String | No | The permissions. Valid values: read and write. Note: This field may return null, indicating that no valid values can be obtained. |
Scheduled start and suspension information
Used by actions: CreateDataEngine.
Name | Type | Required | Description |
---|---|---|---|
ResumeTime | String | No | The scheduled start time, such as 8:00 AM every Monday. Note: This field may return null, indicating that no valid values can be obtained. |
SuspendTime | String | No | The scheduled suspension time, such as 8:00 PM every Monday. Note: This field may return null, indicating that no valid values can be obtained. |
SuspendStrategy | Integer | No | The suspension setting. Valid values: 0 (suspension after task end, default) and 1 (force suspension).Note: This field may return null, indicating that no valid values can be obtained. |
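The table above can be illustrated with a hypothetical CreateDataEngine scheduling payload. The time strings below are invented (the table gives only examples like "8:00 AM every Monday", not an exact format), so treat them as placeholders rather than a documented syntax.

```python
# Illustrative scheduled start/suspension payload; not official SDK code.
# The time-string format shown here is an assumption, not documented above.
strategy = {
    "ResumeTime": "8:00 AM every Monday",   # placeholder; exact format unspecified
    "SuspendTime": "8:00 PM every Monday",  # placeholder; exact format unspecified
    "SuspendStrategy": 0,  # 0 = suspend after task end (default), 1 = force suspension
}

assert strategy["SuspendStrategy"] in (0, 1)
```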
Engine configurations
Used by actions: CreateDataEngine.
Name | Type | Required | Description |
---|---|---|---|
The data governance rules.
Used by actions: CreateInternalTable, GenerateCreateMangedTableSql.
Name | Type | Required | Description |
---|---|---|---|
RuleType | String | No | Governance rule type. Valid values: Customize (custom) and Intelligence (intelligent).Note: This field may return null, indicating that no valid values can be obtained. |
GovernEngine | String | No | The governance engine. Note: This field may return null, indicating that no valid values can be obtained. |
SQL statement objects
Used by actions: GenerateCreateMangedTableSql.
Name | Type | Description |
---|---|---|
SQL | String | The automatically generated SQL statements. |
Query list filter parameter
Used by actions: DescribeSparkAppJobs, DescribeSparkAppTasks, DescribeTasks.
Name | Type | Required | Description |
---|---|---|---|
Name | String | Yes | Attribute name. If more than one filter exists, the logical relationship between these filters is OR . |
Values | Array of String | Yes | Attribute value. If multiple values exist in one filter, the logical relationship between these values is OR . |
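To illustrate the OR semantics described above, the sketch below composes a Filter list as it might be passed to DescribeTasks. The attribute names (`task-state`, `task-sql-keyword`) are hypothetical placeholders, not values confirmed by this table.

```python
# Illustrative Filter list; multiple filters are ORed, and multiple Values
# within one filter are also ORed. Attribute names here are hypothetical.
filters = [
    {"Name": "task-state", "Values": ["1", "2"]},
    {"Name": "task-sql-keyword", "Values": ["SELECT"]},
]

for f in filters:
    assert f["Name"]                                   # Name is required
    assert isinstance(f["Values"], list) and f["Values"]  # Values is required
```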
Configuration format
Used by actions: CreateSparkSessionBatchSQL, CreateTask, CreateTasks, DescribeSparkSessionBatchSqlLog.
Name | Type | Required | Description |
---|---|---|---|
Key | String | Yes | Configured key. Note: This field may return null, indicating that no valid values can be obtained. |
Value | String | Yes | Configured value. Note: This field may return null, indicating that no valid values can be obtained. |
Permission objects
Used by actions: UpdateRowFilter.
Name | Type | Required | Description |
---|---|---|---|
Database | String | Yes | The name of the target database. * represents all databases in the current catalog. To grant admin permissions, it must be * ; to grant data connection permissions, it must be null; to grant other permissions, it can be any database. |
Catalog | String | Yes | The name of the target data source. To grant admin permission, it must be * (all resources at this level); to grant data source and database permissions, it must be COSDataCatalog or * ; to grant table permissions, it can be a custom data source; if it is left empty, DataLakeCatalog is used. Note: To grant permissions on a custom data source, the permissions that can be managed in the Data Lake Compute console are subsets of the account permissions granted when you connect the data source to the console. |
Table | String | Yes | The name of the target table. * represents all tables in the current database. To grant admin permissions, it must be * ; to grant data connection and database permissions, it must be null; to grant other permissions, it can be any table. |
Operation | String | Yes | The target permissions, which vary by permission level. Admin: ALL (default); data connection: CREATE ; database: ALL , CREATE , ALTER , and DROP ; table: ALL , SELECT , INSERT , ALTER , DELETE , DROP , and UPDATE . Note: For table permissions, if a data source other than COSDataCatalog is specified, only the SELECT permission can be granted here. |
PolicyType | String | No | The permission type. Valid values: ADMIN , DATASOURCE , DATABASE , TABLE , VIEW , FUNCTION , COLUMN , and ENGINE . Note: If it is left empty, ADMIN is used. |
Function | String | No | The name of the target function. * represents all functions in the current catalog. To grant admin permissions, it must be * ; to grant data connection permissions, it must be null; to grant other permissions, it can be any function.Note: This field may return null, indicating that no valid values can be obtained. |
View | String | No | The name of the target view. * represents all views in the current database. To grant admin permissions, it must be * ; to grant data connection and database permissions, it must be null; to grant other permissions, it can be any view.Note: This field may return null, indicating that no valid values can be obtained. |
Column | String | No | The name of the target column. * represents all columns. To grant admin permissions, it must be * .Note: This field may return null, indicating that no valid values can be obtained. |
DataEngine | String | No | The name of the target data engine. * represents all engines. To grant admin permissions, it must be * .Note: This field may return null, indicating that no valid values can be obtained. |
ReAuth | Boolean | No | Whether the grantee is allowed to further grant the permissions. Valid values: false (default) and true (the grantee can grant permissions gained here to other sub-users).Note: This field may return null, indicating that no valid values can be obtained. |
Source | String | No | The permission source, which is not required when input parameters are passed in. Valid values: USER (from the user) and WORKGROUP (from one or more associated work groups).Note: This field may return null, indicating that no valid values can be obtained. |
Mode | String | No | The grant mode, which is not required as an input parameter. Valid values: COMMON and SENIOR .Note: This field may return null, indicating that no valid values can be obtained. |
Operator | String | No | The operator, which is not required as an input parameter. Note: This field may return null, indicating that no valid values can be obtained. |
CreateTime | String | No | The permission policy creation time, which is not required as an input parameter. Note: This field may return null, indicating that no valid values can be obtained. |
SourceId | Integer | No | The ID of the work group, which applies only when the value of the Source field is WORKGROUP .Note: This field may return null, indicating that no valid values can be obtained. |
SourceName | String | No | The name of the work group, which applies only when the value of the Source field is WORKGROUP .Note: This field may return null, indicating that no valid values can be obtained. |
Id | Integer | No | The policy ID. Note: This field may return null, indicating that no valid values can be obtained. |
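As a worked example of the rules above, the sketch below assembles a minimal table-level policy granting SELECT. The database and table names are invented for illustration; only the field semantics come from the table.

```python
# Illustrative table-level Policy granting SELECT; not official SDK code.
# "demo_db" and "demo_table" are made-up names.
policy = {
    "Catalog": "DataLakeCatalog",  # the default catalog when left empty
    "Database": "demo_db",         # any database for non-admin permissions
    "Table": "demo_table",         # any table for non-admin permissions
    "Operation": "SELECT",
    "PolicyType": "TABLE",         # defaults to ADMIN if left empty
}

TABLE_OPERATIONS = {"ALL", "SELECT", "INSERT", "ALTER", "DELETE", "DROP", "UPDATE"}
assert policy["Operation"] in TABLE_OPERATIONS
```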
Presto monitoring metrics
Used by actions: DescribeSparkAppTasks, DescribeTasks.
Name | Type | Description |
---|---|---|
LocalCacheHitRate | Float | The Alluxio cache hit rate. Note: This field may return null, indicating that no valid values can be obtained. |
FragmentCacheHitRate | Float | The Fragment cache hit rate. Note: This field may return null, indicating that no valid values can be obtained. |
Properties of database and table
Used by actions: CreateInternalTable, GenerateCreateMangedTableSql.
Name | Type | Required | Description |
---|---|---|---|
Key | String | Yes | The property key name. |
Value | String | Yes | The property value. |
SQL query task
Used by actions: CreateTask.
Name | Type | Required | Description |
---|---|---|---|
SQL | String | Yes | Base64-encoded SQL statement |
Config | Array of KVPair | No | Task configuration information |
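A minimal sketch of building this payload: the SQL statement is Base64-encoded before submission, and optional settings go in the `Config` list of KVPair objects. The configuration key shown is a made-up placeholder.

```python
import base64

# Illustrative SQLTask payload for CreateTask; not official SDK code.
sql = "SELECT 1"
task = {
    "SQL": base64.b64encode(sql.encode("utf-8")).decode("ascii"),
    "Config": [{"Key": "some.config.key", "Value": "value"}],  # placeholder KVPair
}

# Round-trip check: decoding recovers the original statement.
assert base64.b64decode(task["SQL"]).decode("utf-8") == sql
```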
The session resource configuration template for a Spark cluster.
Used by actions: CreateDataEngine.
Name | Type | Required | Description |
---|---|---|---|
DriverSize | String | No | The driver size. Valid values for the standard resource type: small , medium , large , and xlarge .Valid values for the memory resource type: m.small , m.medium , m.large , and m.xlarge .Note: This field may return null, indicating that no valid values can be obtained. |
ExecutorSize | String | No | The executor size. Valid values for the standard resource type: small , medium , large , and xlarge .Valid values for the memory resource type: m.small , m.medium , m.large , and m.xlarge .Note: This field may return null, indicating that no valid values can be obtained. |
ExecutorNums | Integer | No | The executor count. The minimum value is 1 and the maximum value is less than the cluster specification. Note: This field may return null, indicating that no valid values can be obtained. |
ExecutorMaxNumbers | Integer | No | The maximum executor count (in dynamic mode). The minimum value is 1 and the maximum value is less than the cluster specification. If you set ExecutorMaxNumbers to a value smaller than that of ExecutorNums , the value of ExecutorMaxNumbers is automatically changed to that of ExecutorNums .Note: This field may return null, indicating that no valid values can be obtained. |
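The ExecutorMaxNumbers rule above (a value below ExecutorNums is raised to ExecutorNums) can be mimicked client-side, which is useful for validating a template before calling CreateDataEngine. This is a sketch of the stated behavior, not SDK code.

```python
# Mimics the documented server-side adjustment of ExecutorMaxNumbers.
def effective_executor_max(executor_nums: int, executor_max: int) -> int:
    """Return the ExecutorMaxNumbers value the service would actually use."""
    return max(executor_nums, executor_max)

assert effective_executor_max(4, 2) == 4   # raised up to ExecutorNums
assert effective_executor_max(2, 8) == 8   # valid dynamic range kept as-is
```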
Spark job details
Used by actions: DescribeSparkAppJob, DescribeSparkAppJobs.
Name | Type | Description |
---|---|---|
JobId | String | Spark job ID |
JobName | String | Spark job name |
JobType | Integer | Spark job type. Valid values: 1 (batch job), 2 (streaming job). |
DataEngine | String | Engine name |
Eni | String | This field is deprecated. Use the DataSource field instead. |
IsLocal | String | Whether the program package is uploaded locally. Valid values: cos , lakefs . |
JobFile | String | Program package path |
RoleArn | Integer | Role ID |
MainClass | String | Main class of Spark job execution |
CmdArgs | String | Command line parameters of the Spark job separated by space |
JobConf | String | Native Spark configurations separated by line break |
IsLocalJars | String | Whether the dependency JAR packages are uploaded locally. Valid values: cos , lakefs . |
JobJars | String | Dependency JAR packages of the Spark job separated by comma |
IsLocalFiles | String | Whether the dependency file is uploaded locally. Valid values: cos , lakefs . |
JobFiles | String | Dependency files of the Spark job separated by comma |
JobDriverSize | String | Driver resource size of the Spark job |
JobExecutorSize | String | Executor resource size of the Spark job |
JobExecutorNums | Integer | Number of Spark job executors |
JobMaxAttempts | Integer | Maximum number of retries of the Spark streaming task |
JobCreator | String | Spark job creator |
JobCreateTime | Integer | Spark job creation time |
JobUpdateTime | Integer | Spark job update time |
CurrentTaskId | String | Last task ID of the Spark job |
JobStatus | Integer | Last status of the Spark job |
StreamingStat | StreamingStatistics | Spark streaming job statistics Note: This field may return null, indicating that no valid values can be obtained. |
DataSource | String | Data source name Note: This field may return null, indicating that no valid values can be obtained. |
IsLocalPythonFiles | String | PySpark: Dependency upload method. 1: cos; 2: lakefs (this method needs to be used in the console but cannot be called through APIs). Note: This field may return null, indicating that no valid values can be obtained. |
AppPythonFiles | String | This returned value is deprecated. Note: This field may return null, indicating that no valid values can be obtained. |
IsLocalArchives | String | Archives: Dependency upload method. 1: cos; 2: lakefs (this method needs to be used in the console but cannot be called through APIs). Note: This field may return null, indicating that no valid values can be obtained. |
JobArchives | String | Archives: Dependency resources Note: This field may return null, indicating that no valid values can be obtained. |
SparkImage | String | The Spark image version. Note: This field may return null, indicating that no valid values can be obtained. |
JobPythonFiles | String | PySpark: Python dependency, which can be in .py, .zip, or .egg format. Multiple files should be separated by comma. Note: This field may return null, indicating that no valid values can be obtained. |
TaskNum | Integer | Number of tasks running or ready to run under the current job Note: This field may return null, indicating that no valid values can be obtained. |
DataEngineStatus | Integer | Engine status. -100 (default): unknown; -2 to 11: normal. Note: This field may return null, indicating that no valid values can be obtained. |
JobExecutorMaxNumbers | Integer | The specified executor count (max), which defaults to 1. This parameter applies if the "Dynamic" mode is selected. If the "Dynamic" mode is not selected, the executor count is equal to JobExecutorNums .Note: This field may return null, indicating that no valid values can be obtained. |
SparkImageVersion | String | The image version. Note: This field may return null, indicating that no valid values can be obtained. |
SessionId | String | The ID of the associated Data Lake Compute query script. Note: This field may return null, indicating that no valid values can be obtained. |
DataEngineClusterType | String | The engine cluster type. spark_emr_livy indicates that an EMR cluster is created. Note: This field may return null, indicating that no valid values can be obtained. |
DataEngineImageVersion | String | The cluster image version. Spark 3.2-EMR indicates that the Spark 3.2 image is used. Note: This field may return null, indicating that no valid values can be obtained. |
IsInherit | Integer | Whether the task resource configuration is inherited from the cluster template. Valid values: 0 (default): No; 1 : Yes.Note: This field may return null, indicating that no valid values can be obtained. |
IsSessionStarted | Boolean | Whether the task runs with session SQL statements. Valid values: false (no) and true (yes). Note: This field may return null, indicating that no valid values can be obtained. |
Spark monitoring metrics
Used by actions: DescribeSparkAppTasks, DescribeTasks.
Name | Type | Description |
---|---|---|
ShuffleWriteBytesCos | Integer | The shuffle data (in bytes) that overflows to COS. Note: This field may return null, indicating that no valid values can be obtained. |
ShuffleWriteBytesTotal | Integer | The total shuffle data (in bytes). Note: This field may return null, indicating that no valid values can be obtained. |
Running logs of a Spark SQL batch job
Used by actions: DescribeSparkSessionBatchSqlLog.
Name | Type | Description |
---|---|---|
Step | String | The log step. Valid values: BEG , CS , DS , DSS , DSF , FINF , RTO , CANCEL , CT , DT , DTS , DTF , FINT , and EXCE .Note: This field may return null, indicating that no valid values can be obtained. |
Time | String | Time. Note: This field may return null, indicating that no valid values can be obtained. |
Message | String | The log message. Note: This field may return null, indicating that no valid values can be obtained. |
Operate | Array of SparkSessionBatchLogOperate | The operation. Note: This field may return null, indicating that no valid values can be obtained. |
Operation information in the logs of a Spark SQL batch job
Used by actions: DescribeSparkSessionBatchSqlLog.
Name | Type | Description |
---|---|---|
Text | String | The operation message. Note: This field may return null, indicating that no valid values can be obtained. |
Operate | String | The operation type. Valid values: COPY , LOG , UI , RESULT , List , and TAB .Note: This field may return null, indicating that no valid values can be obtained. |
Supplement | Array of KVPair | Additional information, such as taskid, sessionid, and sparkui. Note: This field may return null, indicating that no valid values can be obtained. |
Statistics of the Spark streaming task
Used by actions: DescribeSparkAppJob, DescribeSparkAppJobs.
Name | Type | Description |
---|---|---|
StartTime | String | Task start time |
Receivers | Integer | Number of data receivers |
NumActiveReceivers | Integer | Number of receivers in service |
NumInactiveReceivers | Integer | Number of inactive receivers |
NumActiveBatches | Integer | Number of running batches |
NumRetainedCompletedBatches | Integer | Number of retained completed batches |
NumTotalCompletedBatches | Integer | Number of completed batches |
AverageInputRate | Float | Average input speed |
AverageSchedulingDelay | Float | Average queue time |
AverageProcessingTime | Float | Average processing time |
AverageTotalDelay | Float | Average latency |
Table field information
Used by actions: CreateInternalTable, GenerateCreateMangedTableSql.
Name | Type | Required | Description |
---|---|---|---|
Name | String | Yes | The field name. |
Type | String | Yes | The field type. |
Comment | String | No | The field description. |
Default | String | No | The default field value. |
NotNull | Boolean | No | Whether the field is not null. |
Table partition information
Used by actions: CreateInternalTable, GenerateCreateMangedTableSql.
Name | Type | Required | Description |
---|---|---|---|
Name | String | Yes | The field name. |
Type | String | No | The field type. |
Comment | String | No | The field description. |
PartitionType | String | No | The partition type. |
PartitionFormat | String | No | The partition format. |
PartitionDot | Integer | No | The separator count of the partition conversion policy. |
Transform | String | No | The partition conversion policy. |
TransformArgs | Array of String | No | The policy parameters. |
Table configurations
Used by actions: CreateInternalTable, GenerateCreateMangedTableSql.
Name | Type | Required | Description |
---|---|---|---|
DatabaseName | String | Yes | The database name. |
TableName | String | Yes | The table name. |
DatasourceConnectionName | String | No | The data source name. Note: This field may return null, indicating that no valid values can be obtained. |
TableComment | String | No | The table remarks. Note: This field may return null, indicating that no valid values can be obtained. |
Type | String | No | The specific type: table or view .Note: This field may return null, indicating that no valid values can be obtained. |
TableFormat | String | No | The data format type, such as hive and iceberg .Note: This field may return null, indicating that no valid values can be obtained. |
UserAlias | String | No | The table creator name. Note: This field may return null, indicating that no valid values can be obtained. |
UserSubUin | String | No | The table creator ID. Note: This field may return null, indicating that no valid values can be obtained. |
GovernPolicy | DataGovernPolicy | No | The data governance configuration. Note: This field may return null, indicating that no valid values can be obtained. |
DbGovernPolicyIsDisable | String | No | Whether database data governance is disabled. Valid values: true (disabled) and false (not disabled).Note: This field may return null, indicating that no valid values can be obtained. |
Tag pair info
Used by actions: CreateDataEngine.
Name | Type | Required | Description |
---|---|---|---|
TagKey | String | No | The tag key. Note: This field may return null, indicating that no valid values can be obtained. |
TagValue | String | No | The tag value. Note: This field may return null, indicating that no valid values can be obtained. |
Task type, such as SQL query.
Used by actions: CreateTask.
Name | Type | Required | Description |
---|---|---|---|
SQLTask | SQLTask | No | SQL query task |
SparkSQLTask | SQLTask | No | Spark SQL query task |
The task instance.
Used by actions: DescribeSparkAppTasks, DescribeTasks.
Name | Type | Description |
---|---|---|
DatabaseName | String | Database name of the task |
DataAmount | Integer | Data volume of the task |
Id | String | Task ID |
UsedTime | Integer | The compute time in ms. |
OutputPath | String | Task output path |
CreateTime | String | Task creation time |
State | Integer | The task status. Valid values: 0 (initializing), 1 (executing), 2 (executed), 3 (writing data), 4 (queuing), -1 (failed), and -3 (canceled). |
SQLType | String | SQL statement type of the task, such as DDL and DML. |
SQL | String | SQL statement of the task |
ResultExpired | Boolean | Whether the result has expired |
RowAffectInfo | String | Number of affected data rows |
DataSet | String | Dataset of task results Note: This field may return null, indicating that no valid values can be obtained. |
Error | String | Failure information, such as errorMessage. This field is deprecated. |
Percentage | Integer | Task progress (%) |
OutputMessage | String | Output information of task execution |
TaskType | String | Type of the engine executing the SQL statement |
ProgressDetail | String | Task progress details Note: This field may return null, indicating that no valid values can be obtained. |
UpdateTime | String | Task end time Note: This field may return null, indicating that no valid values can be obtained. |
DataEngineId | String | Compute resource ID Note: This field may return null, indicating that no valid values can be obtained. |
OperateUin | String | Sub-UIN that executes the SQL statement Note: This field may return null, indicating that no valid values can be obtained. |
DataEngineName | String | Compute resource name Note: This field may return null, indicating that no valid values can be obtained. |
InputType | String | The import type: local import or COS. Note: This field may return null, indicating that no valid values can be obtained. |
InputConf | String | Import configuration Note: This field may return null, indicating that no valid values can be obtained. |
DataNumber | Integer | Number of data entries Note: This field may return null, indicating that no valid values can be obtained. |
CanDownload | Boolean | Whether the data can be downloaded Note: This field may return null, indicating that no valid values can be obtained. |
UserAlias | String | User alias Note: This field may return null, indicating that no valid values can be obtained. |
SparkJobName | String | Spark application job name Note: This field may return null, indicating that no valid values can be obtained. |
SparkJobId | String | Spark application job ID Note: This field may return null, indicating that no valid values can be obtained. |
SparkJobFile | String | JAR file of the Spark application entry Note: This field may return null, indicating that no valid values can be obtained. |
UiUrl | String | Spark UI URL Note: This field may return null, indicating that no valid values can be obtained. |
TotalTime | Integer | The task time in ms. Note: This field may return null, indicating that no valid values can be obtained. |
CmdArgs | String | The program entry parameter for running a task under a Spark job. Note: This field may return null, indicating that no valid values can be obtained. |
ImageVersion | String | The image version of the cluster. Note: This field may return null, indicating that no valid values can be obtained. |
DriverSize | String | The driver size. Valid values for the standard resource type: small , medium , large , and xlarge .Valid values for the memory resource type: m.small , m.medium , m.large , and m.xlarge .Note: This field may return null, indicating that no valid values can be obtained. |
ExecutorSize | String | The executor size. Valid values for the standard resource type: small , medium , large , and xlarge .Valid values for the memory resource type: m.small , m.medium , m.large , and m.xlarge .Note: This field may return null, indicating that no valid values can be obtained. |
ExecutorNums | Integer | The executor count. The minimum value is 1 and the maximum value is less than the cluster specification. Note: This field may return null, indicating that no valid values can be obtained. |
ExecutorMaxNumbers | Integer | The maximum executor count (in dynamic mode). The minimum value is 1 and the maximum value is less than the cluster specification. If you set ExecutorMaxNumbers to a value smaller than that of ExecutorNums , the value of ExecutorMaxNumbers is automatically changed to that of ExecutorNums .Note: This field may return null, indicating that no valid values can be obtained. |
CommonMetrics | CommonMetrics | Common task metrics. Note: This field may return null, indicating that no valid values can be obtained. |
SparkMonitorMetrics | SparkMonitorMetrics | The Spark task metrics. Note: This field may return null, indicating that no valid values can be obtained. |
PrestoMonitorMetrics | PrestoMonitorMetrics | The Presto task metrics. Note: This field may return null, indicating that no valid values can be obtained. |
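When polling tasks, the numeric State codes listed above are easier to work with as names. The sketch below is a client-side convenience mapping built from the values in this table; it is not part of the API itself.

```python
# Client-side mapping of the documented State codes to readable names.
STATE_NAMES = {
    0: "initializing",
    1: "executing",
    2: "executed",
    3: "writing data",
    4: "queuing",
    -1: "failed",
    -3: "canceled",
}

assert STATE_NAMES[2] == "executed"
assert STATE_NAMES[-3] == "canceled"
```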
The task result information.
Used by actions: DescribeTaskResult.
Name | Type | Description |
---|---|---|
TaskId | String | Unique task ID |
DatasourceConnectionName | String | Name of the default selected data source when the current job is executed Note: This field may return null, indicating that no valid values can be obtained. |
DatabaseName | String | Name of the default selected database when the current job is executed Note: This field may return null, indicating that no valid values can be obtained. |
SQL | String | The currently executed SQL statement. Each task contains one SQL statement. |
SQLType | String | Type of the executed task. Valid values: DDL , DML , DQL . |
State | Integer | The current task status. Valid values: 0 (initializing), 1 (executing), 2 (executed), 3 (writing data), 4 (queuing), -1 (failed), and -3 (canceled). A task execution result is returned only when the task is executed successfully. |
DataAmount | Integer | Amount of the data scanned in bytes |
UsedTime | Integer | The compute time in ms. |
OutputPath | String | Address of the COS bucket for storing the task result |
CreateTime | String | Task creation timestamp |
OutputMessage | String | Task execution information. success will be returned if the task succeeds; otherwise, the failure cause will be returned. |
RowAffectInfo | String | Number of affected rows |
ResultSchema | Array of Column | Schema information of the result Note: This field may return null, indicating that no valid values can be obtained. |
ResultSet | String | Result information. After it is unescaped, each element of the outer array is a data row. Note: This field may return null, indicating that no valid values can be obtained. |
NextToken | String | Pagination information. If there is no more result data, nextToken will be empty. |
Percentage | Integer | Task progress (%) |
ProgressDetail | String | Task progress details |
DisplayFormat | String | Console display format. Valid values: table , text . |
TotalTime | Integer | The task time in ms. |
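Per the ResultSet description above, the unescaped string is an outer array whose elements are data rows. A minimal parsing sketch, assuming the unescaped payload is JSON (the sample string below is made up):

```python
import json

# Illustrative parsing of a DescribeTaskResult ResultSet string after
# unescaping. The sample content is invented for this example.
result_set = '[["1","alice"],["2","bob"]]'
rows = json.loads(result_set)

assert len(rows) == 2          # two data rows
assert rows[0] == ["1", "alice"]
```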
Collection of tasks executed sequentially in batches
Used by actions: CreateTasks.
Name | Type | Required | Description |
---|---|---|---|
TaskType | String | Yes | Task type. Valid values: SQLTask (SQL query task), SparkSQLTask (Spark SQL query task). |
FailureTolerance | String | Yes | Fault tolerance policy. Proceed : continues to execute subsequent tasks after the current task fails or is canceled. Terminate : terminates the execution of subsequent tasks after the current task fails or is canceled, and marks all subsequent tasks as canceled. |
SQL | String | Yes | Base64-encoded SQL statements separated by ";". Up to 50 tasks can be submitted at a time, and they will be executed strictly in sequence. |
Config | Array of KVPair | No | Configuration information of the task. Currently, only SparkSQLTask tasks are supported. |
Params | Array of KVPair | No | User-defined parameters of the task |
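The SQL field above can be assembled by joining the statements with ";" and Base64-encoding the result, as sketched below. The statements themselves are placeholders; this is not official SDK code.

```python
import base64

# Illustrative TasksInfo payload for CreateTasks; statements are joined with
# ";" and Base64-encoded, with at most 50 per call.
statements = ["SELECT 1", "SELECT 2"]
assert len(statements) <= 50

payload = {
    "TaskType": "SQLTask",
    "FailureTolerance": "Terminate",  # or "Proceed"
    "SQL": base64.b64encode(";".join(statements).encode("utf-8")).decode("ascii"),
}

# Round-trip check: decoding recovers the joined statements.
assert base64.b64decode(payload["SQL"]).decode("utf-8") == "SELECT 1;SELECT 2"
```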
The task overview.
Used by actions: DescribeTasks.
Name | Type | Description |
---|---|---|
TaskQueuedCount | Integer | The number of tasks in queue. |
TaskInitCount | Integer | The number of initialized tasks. |
TaskRunningCount | Integer | The number of tasks in progress. |
TotalTaskCount | Integer | The total number of tasks in this time range. |
User role
Used by actions: DescribeUserRoles.
Name | Type | Description |
---|---|---|
RoleId | Integer | The role ID. |
AppId | String | The user's app ID. |
Uin | String | The user ID. |
Arn | String | The role permission. |
ModifyTime | Integer | The last modified timestamp. |
Desc | String | The role description. |
RoleName | String | The role name. Note: This field may return null, indicating that no valid values can be obtained. |
Creator | String | The creator UIN. Note: This field may return null, indicating that no valid values can be obtained. |
CosPermissionList | Array of CosPermission | The COS permission list. Note: This field may return null, indicating that no valid values can be obtained. |
PermissionJson | String | The CAM policy in JSON. Note: This field may return null, indicating that no valid values can be obtained. |