Tencent Cloud

TDMQ for CKafka


Conducting Production and Consumption Load Testing on CKafka

Last updated: 2026-01-20 17:10:13

Tools for Tests

The open-source scripts shipped with the Kafka client can be used for producer and consumer performance testing. Test results are reported mainly as the volume of messages sent per second (MB/second) and the number of messages sent per second (records/second).
Kafka producer test script: $KAFKA_HOME/bin/kafka-producer-perf-test.sh
Kafka consumer test script: $KAFKA_HOME/bin/kafka-consumer-perf-test.sh

Testing Commands

Note
ckafka vip:vport in the following sample commands should be replaced with the actual IP address and port assigned to your instance.
Production testing command example:
bin/kafka-producer-perf-test.sh \
--topic test \
--num-records 123 \
--record-size 1000 \
--producer-props bootstrap.servers=ckafka vip:vport \
--throughput 20000
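As a quick sanity check on these parameters, the target bandwidth implied by --throughput and --record-size can be computed directly. A minimal sketch using the sample values above:

```shell
# Target bandwidth implied by the producer test parameters:
# throughput (records/s) x record size (bytes) = bytes/s.
THROUGHPUT=20000   # --throughput value from the sample command
RECORD_SIZE=1000   # --record-size value from the sample command

BYTES_PER_SEC=$((THROUGHPUT * RECORD_SIZE))
echo "Target load: ${BYTES_PER_SEC} bytes/s (~$((BYTES_PER_SEC / 1000000)) MB/s)"
```

Comparing this figure with the MB/second reported by the test script shows whether the producer actually kept up with the requested throughput.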
Consumption testing command example:
bin/kafka-consumer-perf-test.sh \
--topic test \
--new-consumer \
--fetch-size 10000 \
--messages 1000 \
--broker-list ckafka vip:vport
Note that on newer Kafka client versions, the --new-consumer flag has been removed and --broker-list has been replaced by --bootstrap-server; adjust the command to match your client version.

Suggestions for Tests

It is recommended to create three or more partitions to increase throughput, because a TDMQ for CKafka (CKafka) cluster has at least three broker nodes at the backend. If only one partition is created, it will reside on a single broker, which limits CKafka performance.
As messages within each CKafka partition are ordered, too many partitions can degrade production performance. Based on actual load testing, it is recommended to keep the number of partitions at six or fewer.
To make the load test meaningful, simulate concurrency with multiple clients. It is recommended to use multiple servers as load testing clients (producers) and to start multiple load testing processes on each server to increase concurrency. To avoid overloading the testing servers, it is also recommended to start one producer per second rather than starting all producers simultaneously.
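The staggered start-up described above can be scripted. The sketch below assumes the perf-test invocation is wrapped in a run_producer function (hypothetical; substitute your actual command and bootstrap address):

```shell
#!/bin/bash
# Start several load-testing producers one second apart,
# instead of launching them all simultaneously.
NUM_PRODUCERS=4

run_producer() {
  # Hypothetical wrapper; replace the echo with the real invocation, e.g.:
  # bin/kafka-producer-perf-test.sh --topic test --num-records 1000000 \
  #   --record-size 1000 --throughput 20000 \
  #   --producer-props bootstrap.servers=ckafka vip:vport
  echo "producer $1 started"
}

for i in $(seq 1 "$NUM_PRODUCERS"); do
  run_producer "$i" &   # run each producer in the background
  sleep 1               # stagger start-ups by one second
done
wait                     # block until all producers finish
```

Running one copy of this script per testing server keeps the per-server ramp-up gradual while still reaching the desired total concurrency.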
