Test Results for Reference

Last updated: 2025-01-07 11:19:57
This document provides reference results for performance testing of Tencent Cloud TCHouse-D using the TPC-H benchmark (a business intelligence computing test).

About TPC-H Performance Test

TPC-H is a decision support benchmark that consists of a set of business-oriented ad hoc queries and concurrent data modifications. The queries and the data that populates the database have broad industry-wide relevance. The benchmark demonstrates a decision support system's ability to examine large volumes of data, execute highly complex queries, and answer critical business questions. The performance metric reported by TPC-H is the TPC-H Composite Query-per-Hour Performance Metric (QphH@Size), which reflects the system's ability to process multiple queries.
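For reference, the composite metric combines a single-stream (Power) and a multi-stream (Throughput) component. The sketch below illustrates the combination rule from the public TPC-H specification; the numeric inputs are hypothetical and are not results from this document:

```python
import math

def qphh(power_at_size: float, throughput_at_size: float) -> float:
    """QphH@Size is the geometric mean of the Power@Size and
    Throughput@Size metrics, per the TPC-H specification."""
    return math.sqrt(power_at_size * throughput_at_size)

# Hypothetical example values, for illustration only:
print(qphh(10000.0, 8100.0))  # 9000.0
```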

Test Environment

Hardware Environment

Two cluster specifications are tested, each containing 1 FE node and 3 BE nodes, with the FE and BE processes deployed on separate nodes. The specifications were chosen to be close to common user configurations; actual testing does not necessarily consume all of these hardware resources.
Cluster Specifications

| Specification | Node Type | CPU | Memory | Hard Disk |
|---|---|---|---|---|
| Specification 1 (small and medium-sized data scenarios) | 1 FE | 4 cores | 16 GB | Enhanced SSD Cloud Disk, 200 GB |
| Specification 1 (small and medium-sized data scenarios) | 3 BEs | 16 cores | 64 GB | Enhanced SSD Cloud Disk, 1,500 GB |
| Specification 2 (large-scale data scenarios) | 1 FE | 16 cores | 64 GB | Enhanced SSD Cloud Disk, 200 GB |
| Specification 2 (large-scale data scenarios) | 3 BEs | 64 cores | 256 GB | Enhanced SSD Cloud Disk, 1,500 GB |

Software Version

Tencent Cloud TCHouse-D 1.2.7

Test Data

The test is conducted using two data sets, Scale 100 and Scale 1000. The tables created and their row counts are as follows:

| TPC-H Table Name | Rows (Scale 100) | Rows (Scale 1000) | Remarks |
|---|---|---|---|
| REGION | 5 | 5 | Region table |
| NATION | 25 | 25 | Nation table |
| SUPPLIER | 1 million | 10 million | Supplier table |
| PART | 20 million | 200 million | Parts table |
| PARTSUPP | 80 million | 800 million | Parts supply table |
| CUSTOMER | 15 million | 150 million | Customer table |
| ORDERS | 150 million | 1.5 billion | Orders table |
| LINEITEM | 600 million | 6 billion | Order line items table |
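Apart from the fixed-size REGION and NATION tables, row counts grow roughly linearly with the TPC-H scale factor. The sketch below reproduces the row counts in the table above from per-table base multipliers; the multipliers follow the TPC-H data generator's conventions, and the LINEITEM count is approximate (actual generated output varies slightly):

```python
# Base row counts at scale factor 1; REGION and NATION are fixed-size.
BASE_ROWS = {
    "SUPPLIER": 10_000,
    "PART": 200_000,
    "PARTSUPP": 800_000,
    "CUSTOMER": 150_000,
    "ORDERS": 1_500_000,
    "LINEITEM": 6_000_000,  # approximate; dbgen output varies slightly
}
FIXED_ROWS = {"REGION": 5, "NATION": 25}

def expected_rows(table: str, scale_factor: int) -> int:
    """Approximate row count for a TPC-H table at a given scale factor."""
    if table in FIXED_ROWS:
        return FIXED_ROWS[table]
    return BASE_ROWS[table] * scale_factor

print(expected_rows("ORDERS", 100))     # 150000000
print(expected_rows("LINEITEM", 1000))  # 6000000000
```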

Performance Test Results

Note:
Each result is the average time of three runs of the corresponding SQL file, in seconds.

| Query Number | Specification 1, Scale 100 | Specification 2, Scale 100 | Specification 2, Scale 1000 |
|---|---|---|---|
| SQL-1 | 2.05 | 1.00 | 9.67 |
| SQL-2 | 0.23 | 0.24 | 2.05 |
| SQL-3 | 0.71 | 0.62 | 30.46 |
| SQL-4 | 0.50 | 0.38 | 9.74 |
| SQL-5 | 1.01 | 0.72 | 11.10 |
| SQL-6 | 0.06 | 0.05 | 0.58 |
| SQL-7 | 0.48 | 0.40 | 32.13 |
| SQL-8 | 0.86 | 0.61 | 16.00 |
| SQL-9 | 4.20 | 3.19 | 76.98 |
| SQL-10 | 0.84 | 0.64 | 11.36 |
| SQL-11 | 0.18 | 0.16 | 2.14 |
| SQL-12 | 1.76 | 1.47 | 17.03 |
| SQL-13 | 2.85 | 1.60 | 19.02 |
| SQL-14 | 0.16 | 0.15 | 1.73 |
| SQL-15 | 0.25 | 0.20 | 1.66 |
| SQL-16 | 0.39 | 0.35 | 3.56 |
| SQL-17 | 0.51 | 0.42 | 12.38 |
| SQL-18 | 1.72 | 1.07 | 19.64 |
| SQL-19 | 0.48 | 0.28 | 6.75 |
| SQL-20 | 0.35 | 0.34 | 12.90 |
| SQL-21 | 1.74 | 0.82 | 14.61 |
| SQL-22 | 0.42 | 0.39 | 9.60 |
| Total Time | 21.74 | 15.09 | 321.12 |
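The averaging methodology (three runs per SQL file, reported in seconds) can be sketched as follows. The query runner is a hypothetical placeholder, since this document does not specify the client used; a real test would submit each query to the cluster's FE node over the MySQL protocol:

```python
import time
from statistics import mean

def time_query(run_query, sql: str, runs: int = 3) -> float:
    """Run a query `runs` times and return the average wall-clock
    time in seconds, mirroring the methodology in this report."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        run_query(sql)  # e.g. cursor.execute via a MySQL-protocol client
        timings.append(time.perf_counter() - start)
    return mean(timings)

# Usage with a stand-in runner (no cluster connection):
avg = time_query(lambda sql: None, "SELECT 1", runs=3)
print(f"{avg:.2f} s")
```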
