Tencent Cloud

Data Transfer Service


High Data Subscription Delay

Last updated: 2024-07-08 15:45:26

Issue

The producer delay in the data subscription service is too high. Monitoring data shows a large gap between the number of GTIDs recorded by the data subscription service and by the source database, and the subscription service parses only a small number of transactions per second.
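To get a rough sense of how far behind the subscription is, you can compare the transaction counts of two GTID sets (for example, the source's `Executed_Gtid_Set` versus the position the subscription has reached). The following is a minimal sketch; the GTID-set string format (`uuid:1-5:11-18,...`) is standard MySQL, but the helper names are our own:

```python
def gtid_count(gtid_set: str) -> int:
    """Count the transactions covered by a MySQL GTID set string,
    e.g. '3E11FA47-71CA-11E1-9E33-C80AA9429562:1-5:11-18'."""
    total = 0
    for uuid_set in gtid_set.replace("\n", "").split(","):
        parts = uuid_set.strip().split(":")
        # parts[0] is the server UUID; the rest are intervals like '1-5' or '11'
        for interval in parts[1:]:
            if "-" in interval:
                start, end = interval.split("-")
                total += int(end) - int(start) + 1
            else:
                total += 1
    return total

def gtid_gap(source_executed: str, subscribed: str) -> int:
    """Approximate number of transactions the subscription still has to catch up on.
    Assumes the subscribed set is a prefix of the source's executed set."""
    return gtid_count(source_executed) - gtid_count(subscribed)
```

A steadily growing gap indicates the subscription cannot keep up with the source's write rate; a stable gap usually just reflects a fixed parsing latency.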

Possible Causes

1. The source database is overloaded.
2. Data is written to the source database faster than the DTS data subscription service can parse it.
3. Large or complex transactions are written to the source database.

Troubleshooting

1. The source database is overloaded.

Check the source database's monitoring metrics. If its write load is high, a high subscription delay is to be expected. If the load is normal, proceed to the next check.

2. The source database writes data so quickly that the DTS data subscription service cannot parse it all.

Check how fast the source database generates binlogs. If the rate exceeds 50 MB/s, the parsing capability of the DTS data subscription service has most likely reached its upper limit, and a high subscription delay is to be expected.
If the binlog generation rate is below 50 MB/s, proceed to the next check.
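One way to estimate the binlog generation rate is to run `SHOW BINARY LOGS` twice, a fixed interval apart, and compare the total sizes. The sketch below assumes you have already collected the two snapshots as `{log_name: size_in_bytes}` dicts (the function name and dict shape are our own convention, not a DTS API):

```python
def binlog_rate_mb_per_sec(snap_before: dict, snap_after: dict,
                           interval_sec: float) -> float:
    """Approximate binlog generation rate from two SHOW BINARY LOGS snapshots
    ({log_name: size_in_bytes}) taken interval_sec apart.
    Assumes no binlog files were purged between the two snapshots."""
    delta_bytes = sum(snap_after.values()) - sum(snap_before.values())
    return delta_bytes / interval_sec / (1024 * 1024)
```

If the computed rate is consistently above 50 MB/s, the delay is probably bounded by DTS parsing capacity rather than by a configuration issue.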

3. There are large or complicated transactions written to the source database.

Check whether large transactions are being executed in the source database, or whether the subscribed tables contain large fields such as JSON or BLOB columns.
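Large transactions can be spotted by inspecting the text output of the `mysqlbinlog` utility: each event is preceded by a `# at <position>` marker, so the byte span between a transaction's `BEGIN` and its `COMMIT` approximates its size. The following is a rough heuristic sketch (the function name and threshold are our own, not part of any DTS tooling):

```python
import re

def find_large_transactions(binlog_text: str,
                            threshold_bytes: int = 10 * 1024 * 1024):
    """Scan mysqlbinlog text output and return (start_position, approx_size)
    for each transaction whose byte span exceeds threshold_bytes."""
    large = []
    start_pos = None  # '# at' position recorded when BEGIN was seen
    last_pos = 0      # most recent '# at' position
    for line in binlog_text.splitlines():
        m = re.match(r"# at (\d+)", line)
        if m:
            last_pos = int(m.group(1))
        elif line.strip() == "BEGIN":
            start_pos = last_pos
        elif line.strip().startswith("COMMIT"):
            if start_pos is not None and last_pos - start_pos > threshold_bytes:
                large.append((start_pos, last_pos - start_pos))
            start_pos = None
    return large
```

Splitting such transactions into smaller batches on the application side, where feasible, usually reduces the subscription delay.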
