Tencent Cloud

Data Transfer Service


Creating Data Subscription for TDSQL PostgreSQL Edition

Last updated: 2024-07-11 15:25:23
This document describes how to use DTS to create a data subscription task for Tencent Cloud TDSQL PostgreSQL edition.

Prerequisites

You have prepared the TDSQL PostgreSQL edition instance to be subscribed to, and its version meets the requirements. For details, see Databases Supported by Data Subscription.
You have created a subscription account in the source instance with the LOGIN and REPLICATION permissions. To have LOGIN and REPLICATION granted, please submit a ticket.
The subscription account must have the SELECT permission on the tables to be subscribed to. For a whole-database subscription, the account must have the SELECT permission on all tables under the schema. The authorization statement is as follows:
grant SELECT on all tables in schema "schema_name" to "migration account";
The subscription account must also have the SELECT permission on the pg_catalog.pgxc_node table. The authorization statement is as follows:
grant SELECT on pg_catalog.pgxc_node to "migration account";
The wal_level of the DN nodes must be logical.
If a table to be subscribed to is a full replication table (its creation statement contains the distribute by replication keyword), it must have a primary key. If it is not a full replication table, it must have a primary key or have REPLICA IDENTITY set to FULL. The statement to set a table's REPLICA IDENTITY to FULL is:
alter table "table name" REPLICA IDENTITY FULL;
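The primary-key requirements above can be summarized as a small decision rule. The following sketch is purely illustrative: the helper function and its parameters are hypothetical and are not part of any DTS or TDSQL API.

```python
def table_is_subscribable(is_full_replication: bool,
                          has_primary_key: bool,
                          replica_identity_full: bool) -> bool:
    """Hypothetical check for the subscription prerequisites above."""
    if is_full_replication:
        # A full replication table (distribute by replication) must have a PK.
        return has_primary_key
    # Any other table needs a PK or REPLICA IDENTITY FULL.
    return has_primary_key or replica_identity_full
```

For example, a non-replicated table without a primary key qualifies only after running the ALTER TABLE statement above to set its REPLICA IDENTITY to FULL.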

Restrictions

Subscribed messages are stored in the DTS built-in Kafka (a single topic), currently with a default retention period of 1 day and a maximum storage capacity of 500 GB per topic. When data has been stored for more than 1 day, or its volume exceeds 500 GB, the built-in Kafka starts deleting the earliest written data. Therefore, consume data promptly to avoid it being deleted before consumption.
The region of data consumption needs to match the region of the subscription task.
The following data types are currently not supported: gtsvector, pg_dependencies, pg_node_tree, pg_ndistinct, and xml.
When the data subscription source is TDSQL PostgreSQL edition, authorization statements cannot be executed directly. Instead, configure the subscription account's permissions through the TDSQL console: click the instance ID, obtain the instance login information, then log in to the database with a client and authorize the account.
During a subscription task, operations such as modifying the subscription objects cause the task to restart. After a restart, duplicate data may appear on the Kafka consumer side.
DTS transmits data in minimal data cells, and each marked checkpoint position represents one data cell. If a data cell has been fully transmitted when the task restarts, it does not cause duplication; if a data cell is still being transmitted at restart time, it is fetched again after the restart to ensure data integrity, which leads to duplication.
If duplicate data is a concern, implement deduplication logic when consuming data.
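As a sketch of such deduplication logic, a consumer can track the (partition, offset) pairs it has already processed and skip re-delivered records after a task restart. The tuple layout of the records below is an illustrative assumption, not the actual DTS message format.

```python
def deduplicate(records, seen=None):
    """Drop records whose (partition, offset) was already consumed.

    Illustrative client-side dedup sketch; `records` is assumed to be an
    iterable of (partition, offset, payload) tuples.
    """
    seen = set() if seen is None else seen
    fresh = []
    for partition, offset, payload in records:
        key = (partition, offset)
        if key in seen:
            continue  # duplicate re-delivered after a task restart; skip
        seen.add(key)
        fresh.append(payload)
    return fresh
```

Persisting the `seen` set (or simply the highest processed offset per partition) across consumer restarts makes the consumption idempotent end to end.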

Supported SQL Operations for Subscription

Operation Type | Supported SQL Operations
DML | INSERT, UPDATE, DELETE

Directions

1. Log in to the DTS console, choose Data Subscription in the left sidebar, and click Create Subscription.
2. On the page for creating a new Data Subscription, select the appropriate configuration and click Buy Now.
Billing Mode: Monthly Subscription and Pay-as-You-Go are supported.
Region: The region must be the same as that of the database instance to be subscribed to.
Database: Select your actual database type.
Edition: Select Kafka Edition, which supports direct consumption through a Kafka client.
Subscription Instance Name: Set a name for the data subscription instance.
3. After the purchase succeeds, return to the data subscription list and click Configure Subscription in the Operation column to configure the newly purchased subscription. The subscription can be used only after the configuration is complete.
4. On the Configure data subscription page, select the appropriate configuration and click Next.
Instance: Select the corresponding database instance. Currently, read-only and disaster recovery instances do not support data subscription.
Database Account: Enter the account and password of the subscription instance. The account must have the LOGIN and REPLICATION permissions, the SELECT permission on all subscribed objects, and the SELECT permission on the pg_catalog.pgxc_node table.
5. On the Subscription Type and Object Selection page, choose the type of subscription and click Save.
The subscription type is Data Update (subscribes to data updates for selected objects, including INSERT, UPDATE, DELETE operations).
Kafka partition strategy: Partitioning by table name is supported.
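Partitioning by table name means all changes to one table land in the same Kafka partition, which preserves per-table ordering for consumers. Conceptually this resembles hashing the table name, as in the sketch below; the hash function is illustrative only, and DTS's internal partitioning scheme is not documented here.

```python
import hashlib

def partition_for_table(table_name: str, num_partitions: int) -> int:
    """Illustrative stable mapping from table name to partition index."""
    # A stable hash ensures the same table always maps to the same partition.
    digest = hashlib.md5(table_name.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_partitions
```

Because the mapping is deterministic, a consumer assigned a given partition sees every change for the tables hashed to it, in order.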
6. On the pre-verification page, the pre-verification task is expected to run for 2–3 minutes. After passing pre-verification, click Start to complete the data subscription task configuration.
Note:
If the verification fails, fix the reported issues in the instance to be subscribed to as prompted, then run the verification again.
7. After you click Start, the subscription task initializes, which is expected to take 3–4 minutes. Once initialization succeeds, the task enters the Running status.
8. Create a consumer group. The Kafka edition of data subscription supports creating multiple consumer groups for multi-point consumption. Consumption relies on Kafka consumer groups, so a consumer group must be created before data can be consumed.
9. After the subscription instance enters the Running status, you can start consuming data. Consumption from Kafka requires password authentication; for specific examples, see Data Consumption Demo, which provides demo code in multiple languages and explains the main consumption process and key data structures.
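As an illustration of the password authentication mentioned above, a kafka-python consumer would typically use SASL/PLAIN settings along the lines of the configuration below. Every value here is a placeholder assumption; the real endpoint, topic, consumer group, username, and password must be taken from the subscription's consumption details in the DTS console.

```python
# Placeholder connection settings; replace each value with the one shown
# on the subscription's consumption page in the DTS console.
consumer_config = {
    "bootstrap_servers": "dts-kafka-endpoint:9092",  # assumed endpoint
    "group_id": "my-consumer-group",        # consumer group created in step 8
    "security_protocol": "SASL_PLAINTEXT",  # DTS requires password auth
    "sasl_mechanism": "PLAIN",
    "sasl_plain_username": "subscription-account",  # placeholder
    "sasl_plain_password": "password",              # placeholder
    "auto_offset_reset": "earliest",  # start from the oldest retained data
    "enable_auto_commit": False,      # commit manually after processing
}
# With kafka-python, this dict would be passed as
#   KafkaConsumer("subscription-topic", **consumer_config)
```

Manual offset commits (enable_auto_commit set to False) pair well with the deduplication advice in the Restrictions section, since offsets are only advanced after records are durably processed.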
