
How to bulk insert data in TypeORM?

1 Answer


When using TypeORM for bulk data insertion, several methods can significantly enhance performance and efficiency. Here are the main approaches:

1. Bulk Operations Using the save Method

TypeORM's save method supports receiving an array of entities, enabling multiple records to be inserted in a single operation. For example:

```typescript
import { getConnection } from 'typeorm';
import { User } from './entity/User';

async function insertBulkUsers(usersData: User[]) {
  const userRepository = getConnection().getRepository(User);
  await userRepository.save(usersData);
}
```

In this example, usersData is an array containing multiple user entities. Passing the whole array to save issues far fewer round trips to the database than saving each entity individually.

2. Using the insert Method with QueryBuilder

For more complex bulk insertion requirements, QueryBuilder provides a flexible way to construct SQL statements, including bulk inserts:

```typescript
import { getConnection } from 'typeorm';
import { User } from './entity/User';

async function insertBulkUsers(usersData: any[]) {
  await getConnection()
    .createQueryBuilder()
    .insert()
    .into(User)
    .values(usersData)
    .execute();
}
```

In this example, usersData is an array of user data objects, where each element corresponds to a row with keys matching column names and values representing the data.

3. Using Native SQL Queries

For optimal performance, native SQL queries can be executed for bulk insertion:

```typescript
import { getConnection } from 'typeorm';

async function insertBulkUsers(usersData: any[]) {
  const queryRunner = getConnection().createQueryRunner();
  await queryRunner.connect();
  await queryRunner.startTransaction();
  try {
    for (const userData of usersData) {
      await queryRunner.query(
        `INSERT INTO user(name, age) VALUES (?, ?)`,
        [userData.name, userData.age]
      );
    }
    await queryRunner.commitTransaction();
  } catch (err) {
    await queryRunner.rollbackTransaction();
    throw err; // rethrow so callers are not silently left with missing data
  } finally {
    await queryRunner.release();
  }
}
```

Using native SQL gives you full control over the executed statements, but you give up the conveniences and safety features of the ORM, such as entity hydration and automatic escaping of column names. Note that the example above still issues one INSERT statement per row; the transaction only ensures atomicity and avoids per-row commits.
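To get a genuinely single-statement bulk insert with native SQL, you can build one multi-row INSERT. The helper below (a hypothetical `buildBulkInsert`, not part of TypeORM) is a sketch assuming MySQL-style `?` placeholders; for Postgres you would generate `$1, $2, ...` instead:

```typescript
// Hypothetical helper: builds one parameterized multi-row INSERT statement.
// Assumes MySQL-style `?` placeholders.
function buildBulkInsert(
  table: string,
  columns: string[],
  rows: Record<string, unknown>[]
): [string, unknown[]] {
  // One "(?, ?, ...)" tuple per row.
  const tuple = `(${columns.map(() => '?').join(', ')})`;
  const placeholders = rows.map(() => tuple).join(', ');
  const sql = `INSERT INTO ${table}(${columns.join(', ')}) VALUES ${placeholders}`;
  // Flatten row values in column order to match the placeholders.
  const params = rows.flatMap((row) => columns.map((col) => row[col]));
  return [sql, params];
}

// Usage sketch with a query runner:
// const [sql, params] = buildBulkInsert('user', ['name', 'age'], usersData);
// await queryRunner.query(sql, params);
```

Sending all rows in one statement avoids the per-row network round trips of the loop above, at the cost of having to respect your database's limits on statement size and parameter count.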

4. Performance Considerations

When handling large data volumes, consider these optimizations:

  • Batch Operations: Prefer batch operations over inserting records one by one to minimize network round trips and I/O overhead.
  • Transaction Management: Wrap a batch in a single transaction so the database commits once rather than once per row.
  • Index Optimization: Remember that every index on the table adds write overhead; for very large imports it can be worth dropping non-essential indexes and rebuilding them afterwards.
  • Limit Entry Count: For extremely large datasets, split the insertion into fixed-size batches so a single oversized operation does not exhaust memory or exceed database statement limits.
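The last point can be sketched as a simple chunking helper (names here are illustrative, not TypeORM APIs):

```typescript
// Split a large array into fixed-size batches.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Usage sketch: insert users 500 at a time.
// for (const batch of chunk(usersData, 500)) {
//   await userRepository.save(batch);
// }
```

TypeORM's save method also accepts a chunk option that performs this splitting internally, e.g. `userRepository.save(usersData, { chunk: 500 })`.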

By implementing these methods, you can effectively perform bulk data insertion within TypeORM.

June 29, 2024, 12:07
