Tencent Cloud WeData

Parameter Passing

Last updated: 2025-04-15 21:38:52

Feature Overview

This feature supports passing parameters between upstream and downstream tasks. For example, the calculation results of the current task can be passed as parameters to its downstream tasks for use. Multiple results can be passed by addressing the target values with row and column indexes.
Supported task types: Hive SQL, JDBC SQL, Python, Shell, Spark SQL, DLC SQL, Impala, TCHouse-P, and DLC PySpark.

Parameter Configuration Instructions

1. Output current task parameters

If you need to pass parameters from the current task to downstream tasks, select Output current task parameters under Parameter Passing and configure the parameter information.
The output parameter passed to subtasks can take a value of either of the following two types:
Variable (n and m start from 0): supports passing two-dimensional arrays. The specific addressing methods are as follows:
$[n][m]: retrieves the data in row n, column m.
$[n][*]: retrieves the data in row n.
$[*][m]: retrieves the data in column m.
$[*][*]: retrieves the data in all rows and columns.
$[0]: retrieves the data in the first row, first column.
Constant: sets a constant as the output parameter.
For example, suppose the calculation result of parent task A has 3 columns with the values 123, 234, and 1234. Define an output parameter named mark_id in this configuration and fill in mark_id = $[0], which takes the value in the first row and first column of the calculation result (123).
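The row/column addressing above can be sketched in Python. The `resolve` helper and the sample result set below are illustrative only, not part of the WeData platform; they just show which data each `$[n][m]` form selects from a two-dimensional result.

```python
# Illustrative sketch of the $[n][m] addressing semantics (not a WeData API).
# results is a 2-D result set: a list of rows, each row a list of column values.
def resolve(results, row, col):
    """Resolve a $[row][col] reference; row/col is an int index or "*"."""
    rows = results if row == "*" else [results[row]]
    picked = [r if col == "*" else r[col] for r in rows]
    # $[*][m] and $[*][*] keep one entry per row; $[n][m] and $[n][*]
    # collapse to the single selected cell or row.
    return picked if row == "*" else picked[0]

# Sample result matching the example above: one row with columns 123, 234, 1234.
results = [["123", "234", "1234"]]
print(resolve(results, 0, 0))    # $[0][0] -> "123"
print(resolve(results, 0, "*"))  # $[0][*] -> ["123", "234", "1234"]
print(resolve(results, "*", 1))  # $[*][1] -> ["234"]
```

With a single-row result, `$[0]` (first row, first column) selects the same value as `$[0][0]`.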

2. Reference parent task parameters

If the current task needs to reference a parameter defined in a parent task, check Reference parent task parameters under Parameter Passing.
Fill in the input parameter name to define, and select the parent task's output parameter as its value (no options are available if the task has no parent node). For example, for subtask B, define a parameter named mark_id and select the value A.mark_id. Use ${mark_id} in the code; ${mark_id} will be replaced with the string 123.
Example: as shown in the figure, define a parameter named a that takes the parameter a defined in the upstream parent task hivesql_1, and define a parameter named b that takes the parameter b defined in the upstream parent task hivesql_1.
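The substitution described above can be sketched as a simple template rendering step: each ${name} placeholder in the downstream code is replaced with the value taken from the parent task's output parameter. The `render` function is illustrative, not a WeData API; WeData performs this replacement internally before the task runs.

```python
# Illustrative sketch of ${name} placeholder substitution (not a WeData API).
import re

def render(code, params):
    """Replace each ${name} in code with its value from params."""
    return re.sub(r"\$\{(\w+)\}", lambda m: params[m.group(1)], code)

# Subtask B references mark_id = A.mark_id, whose value is "123".
code = "SELECT '${mark_id}' AS mark_id;"
print(render(code, {"mark_id": "123"}))  # -> SELECT '123' AS mark_id;
```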




Use Case

The supported task types fall into four classes: SQL, Shell, Python, and DLC PySpark. The following examples show each class as the output party (upstream) and the input party (downstream); they pass constants for simplicity, but variables are also supported.

SQL class (Hive SQL, JDBC SQL, Spark SQL, DLC SQL, Impala, TCHouse-P)
As output party (upstream), code:
SELECT "This is the value of the parameter"
As input party (downstream), code:
SELECT '${parameter}' AS parameter_value;

Shell class (Shell)
As output party (upstream), code:
echo "This is the value of the parameter"
As input party (downstream), code:
expr ${parameter}

Python class (Python)
As output party (upstream), code:
print("This is the value of the parameter")
As input party (downstream), code:
print('${parameter}')

DLC PySpark class (DLC PySpark)
As output party (upstream), code:
from dlcutils import dlcutils
dlcutils.params.save([["param1","param2","param3"],["param4","param5","param6"]])
As input party (downstream), code:
print('${parameter}')
Remarks: as the output party, only DLC engines with kernel versions Standard-S1.1 and Standard-S1.1(native) are supported. This feature is supported for engines created after March 30, 2025; engines created earlier require an upgrade. To upgrade the engine, contact the DLC operations team.
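The Shell-class flow above can be sketched end to end: the upstream task's stdout is captured as the output parameter, and ${parameter} in the downstream code is replaced with that value before the downstream command runs. The capture and substitution shown here are illustrative; WeData performs these steps internally. The upstream value "40 + 2" is a made-up example chosen so that the downstream `expr` has something to evaluate.

```python
# Illustrative end-to-end sketch of the Shell-class parameter flow
# (the capture/substitution is done by WeData, not by user code).
import subprocess

# Upstream task: prints its result; the platform captures stdout.
upstream = subprocess.run(["echo", "40 + 2"], capture_output=True, text=True)
value = upstream.stdout.strip()  # output parameter value: "40 + 2"

# Downstream task: ${parameter} is replaced before execution.
downstream_code = "expr ${parameter}"
rendered = downstream_code.replace("${parameter}", value)  # "expr 40 + 2"
result = subprocess.run(rendered, shell=True, capture_output=True, text=True)
print(result.stdout.strip())  # -> "42"
```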


