Inserting data requires the INSERT privilege on the target table. Data is inserted with INSERT statements; the following are several commonly used insertion methods.

INSERT INTO...VALUES is the most basic data insertion method, suitable for inserting a single row of data into a table.

```sql
INSERT INTO table_name (column1, column2, ..., columnN)
VALUES (value1, value2, ..., valueN);
```
| Parameter | Description |
| --- | --- |
| table_name | Target table name |
| column1, column2, ..., columnN | List of target column names, separated by commas |
| value1, value2, ..., valueN | List of values corresponding one-to-one with the column names |
```sql
-- Create the employee information table
CREATE TABLE employees (
    id INT PRIMARY KEY AUTO_INCREMENT,
    name VARCHAR(100) NOT NULL,
    department VARCHAR(50),
    salary DECIMAL(10, 2),
    hire_date DATE
);

-- Insert an employee record
INSERT INTO employees (name, department, salary, hire_date)
VALUES ('Zhang San', 'Technology Department', 15000.00, '2026-01-15');
```

```
Query OK, 1 row affected (0.01 sec)
```
Multiple rows can be inserted at once by listing several VALUES groups in a single INSERT statement. Multi-row insertion is typically more efficient than executing multiple single-row INSERT statements one by one, as it reduces the number of network round trips between the client and the server.

```sql
INSERT INTO table_name (column1, column2, ..., columnN)
VALUES
    (value1_1, value1_2, ..., value1_N),
    (value2_1, value2_2, ..., value2_N),
    ...
    (valueM_1, valueM_2, ..., valueM_N);
```
```sql
INSERT INTO employees (name, department, salary, hire_date)
VALUES
    ('Li Si', 'Product Department', 13000.00, '2026-02-01'),
    ('Wang Wu', 'Technology Department', 16000.00, '2026-02-15'),
    ('Zhao Liu', 'Marketing Department', 12000.00, '2026-03-01');
```

```
Query OK, 3 rows affected (0.01 sec)
Records: 3  Duplicates: 0  Warnings: 0
```
The total size of a single statement is limited by the max_allowed_packet parameter. If the data volume to be inserted is large, it is recommended to split the operation into appropriately sized batches.

If the values in the VALUES clause exactly match the order and number of columns in the table, the column name list can be omitted.

```sql
INSERT INTO table_name VALUES (value1, value2, ..., valueN);
```
```sql
INSERT INTO employees VALUES (NULL, 'Sun Qi', 'Operations Department', 11000.00, '2026-03-10');
```
When the column list is omitted, values in the VALUES clause must be provided in full accordance with the order and number of columns in the table structure. For AUTO_INCREMENT columns, NULL or 0 can be used to let the system auto-generate the value. This approach is not recommended in production environments, because table structure changes may cause the statement to fail.

The INSERT INTO...SET syntax assigns values column by column in the form "column_name = value", and is suitable for scenarios where only specific columns need values. Unspecified columns use their default values or NULL.

```sql
INSERT INTO table_name SET column1 = value1, column2 = value2, ...;
```
```sql
INSERT INTO employees SET name = 'Zhou Ba', department = 'Technology Department', salary = 14000.00, hire_date = '2026-03-15';
```
The INSERT INTO...SELECT statement inserts the result of a query on one table into another table, and is suitable for inter-table data migration or data aggregation.

```sql
INSERT INTO target_table (column1, column2, ..., columnN)
SELECT column1, column2, ..., columnN
FROM source_table
WHERE condition;
```
```sql
-- Create an archive table
CREATE TABLE employees_archive LIKE employees;

-- Insert Technology Department employee data into the archive table
INSERT INTO employees_archive (name, department, salary, hire_date)
SELECT name, department, salary, hire_date
FROM employees
WHERE department = 'Technology Department';
```
Note that INSERT INTO...SELECT may hold read locks on the source table for a long time when the data volume is large. It is recommended to execute it during off-peak business hours.

INSERT IGNORE skips the conflicting row when a primary key or unique key conflict is encountered: the conflicting row is not inserted, and subsequent insert operations continue.

```sql
INSERT IGNORE INTO table_name (column1, column2, ...)
VALUES (value1, value2, ...);
```
```sql
-- Assume the record with id = 1 already exists
INSERT IGNORE INTO employees (id, name, department, salary, hire_date)
VALUES (1, 'New Employee', 'Technology Department', 20000.00, '2026-03-20');
```

```
Query OK, 0 rows affected, 1 warning (0.00 sec)
```
On conflict, INSERT IGNORE does not return an error but generates a warning; SHOW WARNINGS can be used to view the specific conflict information.

INSERT...ON DUPLICATE KEY UPDATE does not skip the row when a primary key or unique key conflict is encountered. Instead, it performs the update specified in the UPDATE clause. This statement is suitable for "insert or update" scenarios.

```sql
INSERT INTO table_name (column1, column2, ..., columnN)
VALUES (value1, value2, ..., valueN)
ON DUPLICATE KEY UPDATE column1 = new_value1, column2 = new_value2, ...;
```
```sql
INSERT INTO employees (id, name, department, salary, hire_date)
VALUES (1, 'Zhang San', 'Technology Department', 18000.00, '2026-01-15')
ON DUPLICATE KEY UPDATE salary = 18000.00;
```
If the record with id = 1 already exists, this statement updates its salary field to 18000.00.

Note that when a table has multiple unique keys, INSERT...ON DUPLICATE KEY UPDATE may match an unexpected unique key and update unintended rows. It is recommended to use this syntax only on tables with a single unique key.

REPLACE INTO deletes the old row and inserts a new one when a primary key or unique key conflict is encountered.

```sql
REPLACE INTO table_name (column1, column2, ..., columnN)
VALUES (value1, value2, ..., valueN);
```
```sql
REPLACE INTO employees (id, name, department, salary, hire_date)
VALUES (1, 'Zhang San', 'Technology Department', 20000.00, '2026-01-15');
```
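The three conflict-handling behaviors described above (ignore, update on conflict, replace) can be compared side by side in a small client-side sketch. This is an illustration only, using Python's built-in sqlite3 module as a stand-in for a MySQL client; note that SQLite's syntax differs from MySQL's (INSERT OR IGNORE corresponds to INSERT IGNORE, ON CONFLICT...DO UPDATE corresponds to ON DUPLICATE KEY UPDATE, and INSERT OR REPLACE corresponds to REPLACE INTO):

```python
# Illustrative comparison of conflict-handling semantics, using SQLite
# as a stand-in for a MySQL-compatible database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE employees (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        salary REAL
    )
""")
conn.execute("INSERT INTO employees VALUES (1, 'Zhang San', 15000.00)")

# Ignore: the conflicting row is skipped, existing data is untouched.
conn.execute("INSERT OR IGNORE INTO employees VALUES (1, 'New Employee', 20000.00)")
# salary is still 15000.0 here

# Upsert: on conflict, update only the listed column of the existing row.
conn.execute("""
    INSERT INTO employees VALUES (1, 'Zhang San', 18000.00)
    ON CONFLICT(id) DO UPDATE SET salary = excluded.salary
""")
# salary is now 18000.0

# Replace: the old row is deleted and the new row inserted in its place.
conn.execute("INSERT OR REPLACE INTO employees VALUES (1, 'Zhang San', 20000.00)")

print(conn.execute("SELECT name, salary FROM employees WHERE id = 1").fetchone())
```

The semantic differences matter in practice: the ignore form preserves old data, the upsert form merges new values into the existing row, and the replace form discards the old row entirely.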
On conflict, REPLACE INTO first deletes the old row and then inserts the new row, which is equivalent to performing a DELETE followed by an INSERT. Confirm that the business logic allows deleting the old row before using this statement.

Executing INSERT statements row by row is inefficient for large data volumes. TDSQL Boundless supports efficient batch data import through the following methods.

LOAD DATA INFILE reads data from a text file and imports it in batches into the target table, which is suitable for large-volume data import scenarios. Compared with row-by-row INSERT, this statement can improve import speed by tens of times.

```sql
LOAD DATA INFILE 'file_path'
INTO TABLE table_name
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
```
| Parameter | Description |
| --- | --- |
| file_path | Path of the data file |
| FIELDS TERMINATED BY | Field delimiter; for example, `,` indicates comma-separated values |
| ENCLOSED BY | Field value enclosure character; for example, `"` indicates double quotes |
| LINES TERMINATED BY | Line separator; for example, `\n` represents a newline character |
| IGNORE 1 ROWS | Skips the first row of the file (typically the header row) |
```sql
LOAD DATA INFILE '/tmp/employees.csv'
INTO TABLE employees
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(name, department, salary, hire_date);
```
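For reference, a data file matching these options might look as follows (the contents are illustrative; the first line is the header row skipped by IGNORE 1 ROWS, and string fields are enclosed in double quotes per ENCLOSED BY '"'):

```
name,department,salary,hire_date
"Li Si","Product Department",13000.00,2026-02-01
"Wang Wu","Technology Department",16000.00,2026-02-15
```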
Before LOAD DATA INFILE is used, ensure that the current user has the FILE privilege and that the data file path is accessible to the database server.

When LOAD DATA INFILE is not available, batch insertion can be achieved by including multiple VALUES clauses in a single INSERT statement.

```sql
INSERT INTO employees (name, department, salary, hire_date) VALUES
('Employee A', 'Technology Department', 15000.00, '2026-01-01'),
('Employee B', 'Product Department', 13000.00, '2026-01-02'),
('Employee C', 'Marketing Department', 12000.00, '2026-01-03'),
...
('Employee N', 'Operations Department', 11000.00, '2026-01-30');
```
It is recommended that each INSERT statement contain 100 to 1000 rows of data to avoid excessively large individual statements. The maximum allowed size of a single statement can be adjusted via the max_allowed_packet parameter.

The employees table defines an AUTO_INCREMENT auto-increment column. When data is inserted, auto-increment columns can be handled in the following ways.

To let the system assign the value, omit the auto-increment column or specify NULL for it; the system automatically assigns an incrementing unique value.

```sql
-- Omit the auto-increment column
INSERT INTO employees (name, department, salary, hire_date)
VALUES ('Test Employee', 'Technology Department', 10000.00, '2026-03-19');

-- Specify NULL for the auto-increment column
INSERT INTO employees (id, name, department, salary, hire_date)
VALUES (NULL, 'Test Employee 2', 'Technology Department', 10000.00, '2026-03-19');
```
An explicit value can also be specified for the auto-increment column:

```sql
INSERT INTO employees (id, name, department, salary, hire_date)
VALUES (1000, 'Specified ID Employee', 'Technology Department', 10000.00, '2026-03-19');
```
Multi-row INSERT reduces the number of SQL statement parses and network round trips, offering significant performance advantages in batch insertion scenarios.

```sql
INSERT INTO employees (name, department, salary, hire_date) VALUES
('Employee 1', 'Technology Department', 15000.00, '2026-01-01'),
('Employee 2', 'Product Department', 13000.00, '2026-01-02'),
('Employee 3', 'Marketing Department', 12000.00, '2026-01-03');
```
The statement above is equivalent to, but more efficient than, the following row-by-row inserts:

```sql
INSERT INTO employees (name, department, salary, hire_date) VALUES ('Employee 1', 'Technology Department', 15000.00, '2026-01-01');
INSERT INTO employees (name, department, salary, hire_date) VALUES ('Employee 2', 'Product Department', 13000.00, '2026-01-02');
INSERT INTO employees (name, department, salary, hire_date) VALUES ('Employee 3', 'Marketing Department', 12000.00, '2026-01-03');
```
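Multi-row INSERT statements are usually generated client-side by chunking a large row set. A minimal sketch of that pattern, using Python's built-in sqlite3 module as a stand-in for a MySQL client (the table layout follows the employees example; the chunk size of 200 is an arbitrary choice within the 100-1000 rows-per-statement guideline):

```python
# Sketch: split a large row set into chunks and insert each chunk with a
# single multi-row INSERT, instead of one statement per row.
import sqlite3

CHUNK_SIZE = 200  # within the recommended 100-1000 rows per statement

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE employees (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        name TEXT NOT NULL,
        department TEXT,
        salary REAL,
        hire_date TEXT
    )
""")

rows = [(f"Employee {i}", "Technology Department", 15000.00, "2026-01-01")
        for i in range(1000)]

for start in range(0, len(rows), CHUNK_SIZE):
    chunk = rows[start:start + CHUNK_SIZE]
    # Build one INSERT carrying a parenthesized VALUES group per row.
    placeholders = ",".join(["(?, ?, ?, ?)"] * len(chunk))
    flat = [value for row in chunk for value in row]
    conn.execute(
        "INSERT INTO employees (name, department, salary, hire_date) "
        f"VALUES {placeholders}",
        flat,
    )
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM employees").fetchone()[0])  # 1000
```

With 200 rows per statement, the 1000 rows are written in 5 statements rather than 1000, which is where the parse and round-trip savings come from.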
When many rows must be inserted, wrap the INSERT statements in a single transaction and commit them together, to avoid the overhead of committing each statement individually.

```sql
START TRANSACTION;
INSERT INTO employees (name, department, salary, hire_date) VALUES ('Employee 1', 'Technology Department', 15000.00, '2026-01-01');
INSERT INTO employees (name, department, salary, hire_date) VALUES ('Employee 2', 'Product Department', 13000.00, '2026-01-02');
-- More INSERT statements
COMMIT;
```
Alternatively, disable autocommit for the session:

```sql
SET autocommit = 0;
INSERT INTO employees (name, department, salary, hire_date) VALUES ('Employee 1', 'Technology Department', 15000.00, '2026-01-01');
INSERT INTO employees (name, department, salary, hire_date) VALUES ('Employee 2', 'Product Department', 13000.00, '2026-01-02');
-- More INSERT statements
COMMIT;
SET autocommit = 1;
```
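The same single-commit pattern applies when inserting from application code. A minimal sketch, again using Python's built-in sqlite3 module as a stand-in for a MySQL client (DB-API connections behave like autocommit = 0 by default, so the pattern reduces to one commit() after all statements):

```python
# Sketch: run many inserts inside one transaction and commit once,
# instead of paying the commit overhead per statement.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT NOT NULL, salary REAL)")

rows = [(f"Employee {i}", 12000.00) for i in range(1, 4)]

# All inserts run inside one implicit transaction; the single commit()
# at the end plays the role of the COMMIT in the SQL example above.
conn.executemany("INSERT INTO employees (name, salary) VALUES (?, ?)", rows)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM employees").fetchone()[0])  # 3
```

If any statement fails, conn.rollback() discards the whole batch, which is exactly the rollback capability the transactional form provides.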
For very large data imports, it is recommended to use LOAD DATA INFILE instead of INSERT statements. LOAD DATA INFILE performs batch writing by reading text files directly, and its performance is significantly higher than row-by-row INSERT.

When executing INSERT statements, note the following:

- In the default autocommit = 1 mode, INSERT statements are committed automatically. To enable rollback, use START TRANSACTION or BEGIN to start a transaction before executing the INSERT statements.
- Inserted values must be compatible with the column type; for example, inserting a string into an INT column will result in implicit type conversion or an error.
- INSERT operations will not trigger the corresponding BEFORE INSERT and AFTER INSERT triggers.
- If a row violates a NOT NULL, PRIMARY KEY, UNIQUE KEY, or FOREIGN KEY constraint, the system returns an error and aborts the execution of the current statement. When INSERT IGNORE is used, rows that violate constraints are skipped without aborting the entire statement.