I have an API built with TypeScript and Node.js that is connected to a MariaDB database. For database communication I am using the "mariadb" package.
One of the endpoints accepts a payload with data that I need to process. Processing the data involves iterating through the payload and creating new objects that I then insert into the database.
The question is how to do this most efficiently. I have found two ways to do it.
The first is to collect the newly created objects in an array and then use the batch() method of the PoolConnection instance to insert each entry as a separate row. Like this:
const mariadbPool = mariadb.createPool({...});
const conn = await mariadbPool.getConnection();
...
...
const dataArray: string[][] = [];
payload.forEach((data: DataType) => {
// Collect each row's values for the batch insert
dataArray.push([data.foo, data.bar]);
});
// Insert the data into the database:
await conn.batch("INSERT INTO dbname.tbname (foo, bar) VALUES (?, ?)", dataArray);
await conn.commit();
The second way is to run a query against the database on each iteration. Like this:
const mariadbPool = mariadb.createPool({...});
const conn = await mariadbPool.getConnection();
...
...
// Note: await is not allowed inside a plain forEach callback, and even with an
// async callback forEach would not wait for the queries; use for...of instead.
for (const data of payload) {
// Insert the data into the database:
await conn.query("INSERT INTO dbname.tbname (foo, bar) VALUES (?, ?)", [data.foo, data.bar]);
}
await conn.commit();
Is there any fundamental difference between the two methods, and is one of them considered best practice?
>Solution :
At typical request volumes either approach will do the job without any problem, but the most efficient way is the batch query with a recent MariaDB version, according to the connector's own performance comparison: https://github.com/mariadb-corporation/mariadb-connector-nodejs/blob/HEAD/documentation/batch.md#performance-comparison
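To make the batch approach robust, it also helps to roll back on failure and release the connection when done. Below is a minimal sketch: `toRows` is a hypothetical helper that maps the payload to the row arrays `batch()` expects, and `BatchConnection` is a local interface standing in for the `mariadb` package's PoolConnection (only the methods used here), so the example stays self-contained. Table and column names are the question's placeholders.

```typescript
// Local stand-in for the subset of mariadb's PoolConnection used below
// (assumption: not imported from the mariadb package, to keep this self-contained).
interface BatchConnection {
  batch(sql: string, values: unknown[][]): Promise<unknown>;
  commit(): Promise<void>;
  rollback(): Promise<void>;
  release(): Promise<void>;
}

interface DataType {
  foo: string;
  bar: string;
}

// Pure helper: map the payload to the row arrays that batch() expects.
function toRows(payload: DataType[]): string[][] {
  return payload.map((data) => [data.foo, data.bar]);
}

// Sketch of the batched insert with rollback and cleanup.
async function insertBatch(conn: BatchConnection, payload: DataType[]): Promise<void> {
  try {
    await conn.batch(
      "INSERT INTO dbname.tbname (foo, bar) VALUES (?, ?)",
      toRows(payload)
    );
    await conn.commit();
  } catch (err) {
    // Undo the partial transaction before propagating the error
    await conn.rollback();
    throw err;
  } finally {
    await conn.release();
  }
}
```

The `try/catch/finally` shape ensures the connection goes back to the pool whether the insert succeeds or fails.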
