Function Management

Last updated: 2025-12-02 16:27:57
In DLC, you can use user-defined functions (UDFs) to process and transform data, and the console supports managing these functions.

Creating a Function

1. Log in to the DLC Console and select the service region. The account must have database operation permissions.
2. Entry 1: Go to the Metadata Management page, switch to the Database tab, click the name of the database where you want to create the function, and switch to the Function tab.

Entry 2: Go to the Metadata Management page and switch to the Function tab.

3. Click the Create Function button to enter the function creation page.

The function package can be uploaded locally or selected from existing JAR or Python files stored in COS. For local uploads, JAR files are limited to 5 MB and Python files to 2 MB.
Python UDF registration takes effect globally. To configure one, go to the Data Management page, switch to the Functions tab, and click Create. For the creation and management process, see the UDF Function Development Guide.
Select the Spark cluster on which the function will be built; no fees are incurred during this execution.
It is recommended to save the function package to the system for easier management and reuse. Mounting it to a specified COS path is also supported.
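As a reference, the sketch below shows what a minimal Python UDF source file might look like before it is packaged and uploaded. The function name `mask_phone` and its masking logic are illustrative assumptions, not part of DLC's API; follow the UDF Function Development Guide for the exact signature and packaging your engine version expects.

```python
# mask_phone.py - illustrative Python UDF source (hypothetical example).
# This is the kind of file you would upload (under the 2 MB limit)
# when creating a Python function in the console.

def mask_phone(phone):
    """Mask the middle digits of a phone number, e.g. 13812345678 -> 138****5678."""
    if phone is None:
        return None
    s = str(phone)
    if len(s) < 8:
        return s  # too short to mask meaningfully; return unchanged
    return s[:3] + "****" + s[-4:]


if __name__ == "__main__":
    # Quick local check before uploading the file.
    print(mask_phone("13812345678"))  # 138****5678
```

Testing the function locally like this, before upload, makes a failed build in the console easier to diagnose: a build failure is then more likely a packaging or permission issue than a bug in the function body.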

Viewing Function Information

1. Log in to the DLC Console. The account must have database operation permissions.
2. Choose Metadata Management > Database and click the name of the database containing the function you want to view. Alternatively, go to the Metadata Management page and switch to the Function tab for a global view.
3. Select the function to view its build status. If the build failed, click the Edit button on the right and submit it again.
4. Click the function name to view the function details.

Editing Function Information

1. Log in to the DLC Console and select the service region. The account must have database operation permissions.
2. Go to the Metadata Management page and click the name of the database containing the function you want to edit.
3. Select the function and click the Edit button to enter the editing page.
The function name, storage method, and upload method cannot be modified. To change this information, recreate the function.
After its information is modified, the function will be rebuilt. Proceed with caution.

Deleting a Function

Functions that no longer need to be managed can be deleted.
1. Log in to the DLC Console and select the service region. The account must have database operation permissions.
2. Go to the Metadata Management page and click the name of the database containing the function you want to delete.
3. Select the function and click the Delete button.

Note:
After deletion, the data under this function will be cleared and cannot be recovered. Please proceed with caution.
