Tencent Cloud

Data Transfer Service


Consuming MongoDB Data

Last updated: 2024-09-11 14:22:51

Overview

In data subscription (Kafka Edition, where the current Kafka server version is v2.6.0), you can consume the subscribed data with a Kafka client of version 0.11 or later. This document provides client consumption demos so that you can quickly test the data consumption process and understand how to parse the data.

Note

The demo only prints out the consumed data and does not contain any data processing logic. You need to write your own processing logic based on the demo. You can also use Kafka clients in other languages to consume and parse the data.
Currently, data subscription to Kafka for consumption can be implemented over the Tencent Cloud private network, but not over the public network. In addition, the subscribed database instance and the data consumer must be in the same region.
The Kafka cluster built into DTS data subscription has an upper limit on the size of a single message. If a single row of data in the source database exceeds 5 MB, the subscription task may report an error.
To ensure that data can be written again from where the task was paused, DTS adopts a checkpoint mechanism for data subscription. Specifically, as messages are written to Kafka topics, a checkpoint message is inserted every 10 seconds to mark the data sync offset. When the task is resumed after an interruption, writing continues from the checkpointed offset. The consumer commits a consumption offset every time it encounters a checkpoint message, so that the consumption offset is updated in a timely manner.
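The checkpoint-based commit logic described above can be sketched as follows. This is a minimal illustration, not the demos' actual code; detecting a checkpoint by a literal "checkpoint" payload and the message shape are assumptions, since the real demos parse the DTS protocol to identify checkpoint messages.

```python
# Minimal sketch of checkpoint-based offset committing.
# Assumption: a checkpoint message is identified here by the literal
# payload b"checkpoint"; the real demos parse the DTS protocol instead.

def consume_with_checkpoints(messages, commit, handle):
    """Process data messages; commit the offset at every checkpoint.

    messages: iterable of (offset, value) pairs
    commit:   callable invoked with the next offset to commit
    handle:   callable invoked with each data message's value
    """
    for offset, value in messages:
        if value == b"checkpoint":
            # Committing only at checkpoints guarantees that, after a
            # restart, consumption resumes from the last checkpoint
            # rather than from an arbitrary position.
            commit(offset + 1)
        else:
            handle(value)
```

With manual commits driven by checkpoint messages, the consumption offset always lands on a checkpoint boundary, which matches how DTS resumes writing after an interruption.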

Downloading Consumption Demos

Currently, MongoDB data subscription only supports the JSON format. The following demos already contain the JSON protocol file, so you don't need to download it separately.
The consumption demo uses the native changeStream format. For details, see the MongoDB Manual.
JSON demos are available for download in the following languages:
Go
Java
Python
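As a reference for the native changeStream format mentioned above, the sketch below parses one change event. The sample document is illustrative only (not actual DTS output); its field layout follows the standard MongoDB change stream shape: operationType, ns, documentKey, and fullDocument.

```python
import json

# A sample MongoDB change stream event in JSON form (illustrative only).
event_json = """
{
  "operationType": "insert",
  "ns": {"db": "test", "coll": "users"},
  "documentKey": {"_id": "650f1a2b"},
  "fullDocument": {"_id": "650f1a2b", "name": "alice", "age": 30}
}
"""

event = json.loads(event_json)
op = event["operationType"]                                # "insert"
namespace = f'{event["ns"]["db"]}.{event["ns"]["coll"]}'   # "test.users"
doc = event.get("fullDocument")                            # inserted document

print(op, namespace, doc)
```

For update and delete events, fullDocument may be absent; documentKey identifies the affected document in those cases.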

Directions for the Java Demo

Compilation environment: Maven and JDK 8. You can choose a desired package management tool; the following takes Maven as an example.
Runtime environment: Tencent Cloud CVM with JRE 8 installed (it can access the private network address of the Kafka server only if it is in the same region as the subscribed instance).
Follow the steps below:
1. Create a data subscription task (NewDTS) as instructed in Creating Data Subscription to TencentDB for MariaDB.
2. Create one or multiple consumer groups as instructed in Creating Consumer Group.
3. Download the Java demo and decompress it.
4. Enter the decompressed directory, where the Maven project files (including pom.xml) are placed for your use as needed. Package the project with Maven by running mvn clean package.
5. Run the demo. After packaging the project with Maven, enter the target folder and run the following command:
java -jar consumerDemo-json-1.0-SNAPSHOT.jar --brokers xxx --topic xxx --group xxx --user xxx --password xxx --trans2sql --trans2canal
brokers is the private network access address for data subscription to Kafka, and topic is the subscription topic, which can be viewed on the Subscription details page as instructed in Viewing Subscription Details.
group, user, and password are the name, account, and password of the consumer group, which can be viewed on the Consumption Management page as instructed in Managing Consumer Group.
trans2sql indicates whether to enable conversion to SQL statements. In the Java demo, the conversion is enabled if this parameter is present.
trans2canal indicates whether to print the data in the Canal format; the conversion is enabled if this parameter is present. Currently, this parameter only applies to data in the JSON format.
6. Observe the consumption.



Directions for the Golang Demo

Compilation environment: Go 1.12 or later, with the Go module environment configured.
Runtime environment: Tencent Cloud CVM (it can access the private network address of the Kafka server only if it is in the same region as the subscribed instance).
Follow the steps below:
1. Create a data subscription task (NewDTS) as instructed in Creating MySQL or TDSQL for MySQL Data Subscription.
2. Create one or multiple consumer groups as instructed in Creating Consumer Group.
3. Download the Go demo and decompress it.
4. Access the decompressed directory and run go build -o subscribe ./main/main.go to generate the executable file subscribe.
5. Run ./subscribe --brokers=xxx --topic=xxx --group=xxx --user=xxx --password=xxx --trans2sql=true.
brokers is the private network access address for data subscription to Kafka, and topic is the subscription topic, which can be viewed on the Subscription details page as instructed in Viewing Subscription Details.
group, user, and password are the name, account, and password of the consumer group, which can be viewed on the Consumption Management page as instructed in Managing Consumer Group.
trans2sql indicates whether to enable conversion to SQL statements.
6. Observe the consumption.



Directions for the Python3 Demo

Compilation and runtime environment: Tencent Cloud CVM (it can access the private network address of the Kafka server only if it is in the same region as the subscribed instance) with Python 3 and pip3 (for dependency installation) installed. Use pip3 to install the dependency packages:
pip3 install flag
pip3 install kafka-python
Follow the steps below:
1. Create a data subscription task (NewDTS) as instructed in Creating MySQL or TDSQL for MySQL Data Subscription.
2. Create one or multiple consumer groups as instructed in Creating Consumer Group.
3. Download the Python3 demo and decompress it.
4. Run python main.py --brokers=xxx --topic=xxx --group=xxx --user=xxx --password=xxx --trans2sql=1.
brokers is the private network access address for data subscription to Kafka, and topic is the subscription topic, which can be viewed on the Subscription details page as instructed in Viewing Subscription Details.
group, user, and password are the name, account, and password of the consumer group, which can be viewed on the Consumption Management page as instructed in Managing Consumer Group.
trans2sql indicates whether to enable conversion to SQL statements.
5. Observe the consumption.
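The command-line parameters above map onto a kafka-python consumer roughly as follows. This is a minimal sketch under assumptions: the downloaded demo's actual settings may differ, and SASL/PLAIN over plaintext is assumed as the authentication mechanism, so check the demo source for the exact configuration DTS expects.

```python
# Minimal sketch: mapping the demo's CLI parameters onto kafka-python
# KafkaConsumer keyword arguments. The security_protocol and
# sasl_mechanism values are assumptions; verify them against the demo.

def build_consumer_config(brokers, group, user, password):
    """Translate the demo's CLI parameters into KafkaConsumer kwargs."""
    return {
        "bootstrap_servers": brokers.split(","),
        "group_id": group,
        "security_protocol": "SASL_PLAINTEXT",   # assumption
        "sasl_mechanism": "PLAIN",               # assumption
        "sasl_plain_username": user,
        "sasl_plain_password": password,
        "auto_offset_reset": "earliest",
        "enable_auto_commit": False,  # commit manually at checkpoints
    }

# Usage with kafka-python (requires broker connectivity; values are
# hypothetical placeholders):
#   from kafka import KafkaConsumer
#   consumer = KafkaConsumer("xxx", **build_consumer_config(
#       "xxx:9092", "xxx", "xxx", "xxx"))
#   for message in consumer:
#       print(message.offset, message.value)
```

Disabling auto-commit and committing manually fits the checkpoint mechanism described in the notes above: the consumer controls exactly when the consumption offset advances.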


