Java: Insert multiple rows into MySQL with PreparedStatement

I want to insert multiple rows into a MySQL table at once using Java. The number of rows is dynamic. In the past I would do...

for (String[] element : array) {
    myStatement.setString(1, element[0]);
    myStatement.setString(2, element[1]);

    myStatement.executeUpdate();
}

I'd like to optimize this to use the MySQL-supported syntax:

INSERT INTO table (col1, col2) VALUES ('val1', 'val2'), ('val1', 'val2')[, ...]

But with a PreparedStatement I don't know of any way to do this, since I don't know beforehand how many elements array will contain. If it's not possible with a PreparedStatement, how else can I do it (while still escaping the values in array)?


You can create a batch by PreparedStatement#addBatch() and execute it by PreparedStatement#executeBatch().

Here's a kickoff example:

public void save(List<Entity> entities) throws SQLException {
    try (
        Connection connection = database.getConnection();
        PreparedStatement statement = connection.prepareStatement(SQL_INSERT);
    ) {
        int i = 0;

        for (Entity entity : entities) {
            statement.setString(1, entity.getSomeProperty());
            // ...

            statement.addBatch();
            i++;

            if (i % 1000 == 0 || i == entities.size()) {
                statement.executeBatch(); // Execute every 1000 items.
            }
        }
    }
}

It's executed every 1000 items because some JDBC drivers and/or DBs may have a limitation on batch length.


If you can build your SQL statement dynamically, you can use the following workaround:

String myArray[][] = { { "1-1", "1-2" }, { "2-1", "2-2" }, { "3-1", "3-2" } };

StringBuffer mySql = new StringBuffer("insert into MyTable (col1, col2) values (?, ?)");

for (int i = 0; i < myArray.length - 1; i++) {
    mySql.append(", (?, ?)");
}

myStatement = myConnection.prepareStatement(mySql.toString());

for (int i = 0; i < myArray.length; i++) {
    myStatement.setString(i, myArray[i][1]);
    myStatement.setString(i, myArray[i][2]);
}
myStatement.executeUpdate();

It is possible to submit multiple updates in JDBC.

We can use Statement, PreparedStatement, and CallableStatement objects for batch updates, with auto-commit disabled.

The addBatch() and executeBatch() methods are available on all of these statement objects for batch updates.

Here the addBatch() method adds a statement or a set of parameters to the current batch.
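
For instance, here is a minimal sketch (the table name, URL, and credentials are placeholders, not from the answers) of batching plain SQL strings on a Statement with auto-commit disabled, committing only once the whole batch has executed:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class StatementBatchExample {
    public static void main(String[] args) throws Exception {
        // Placeholder URL and credentials for illustration only.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/TestDB", "user", "password")) {
            conn.setAutoCommit(false); // disable auto-commit so the batch commits as one unit
            try (Statement stmt = conn.createStatement()) {
                // addBatch() on a plain Statement queues complete SQL strings
                stmt.addBatch("INSERT INTO batch_test(keyword) VALUES ('a')");
                stmt.addBatch("INSERT INTO batch_test(keyword) VALUES ('b')");
                stmt.executeBatch(); // send the queued statements in one round trip
                conn.commit();       // commit the whole batch
            } catch (Exception e) {
                conn.rollback();     // roll back everything if any statement fails
                throw e;
            }
        }
    }
}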

When the MySQL driver is used, you have to set the connection parameter rewriteBatchedStatements to true (jdbc:mysql://localhost:3306/TestDB?rewriteBatchedStatements=true).

With this parameter the batched statement is rewritten into a bulk insert, so the table is locked only once and the indexes are updated only once. This makes it much faster.

Without this parameter, the only advantage is cleaner source code.
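
As a minimal sketch (the database, table, and credentials are placeholders), this is what the connection URL looks like with the parameter enabled, combined with the addBatch()/executeBatch() pattern shown above:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class RewriteBatchExample {
    public static void main(String[] args) throws Exception {
        // rewriteBatchedStatements=true lets MySQL Connector/J rewrite the
        // batch below into a single multi-row INSERT behind the scenes.
        String url = "jdbc:mysql://localhost:3306/TestDB?rewriteBatchedStatements=true";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO batch_test(keyword) VALUES (?)")) {
            for (String keyword : new String[] { "a", "b", "c" }) {
                ps.setString(1, keyword);
                ps.addBatch();
            }
            ps.executeBatch(); // sent to the server as one multi-row INSERT
        }
    }
}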

In case you have an auto-increment column in the table and need to access the generated keys, you can use the following approach. Do test it before relying on it, because getGeneratedKeys() support on a Statement depends on the driver used. The code below was tested on MariaDB 10.0.12 with the MariaDB JDBC driver 1.2.

Remember that increasing the batch size improves performance only up to a point; on my setup, increasing the batch size above 500 actually degraded performance.

public Connection getConnection(boolean autoCommit) throws SQLException {
    Connection conn = dataSource.getConnection();
    conn.setAutoCommit(autoCommit);
    return conn;
}

private void testBatchInsert(int count, int maxBatchSize) {
    String querySql = "insert into batch_test(keyword) values(?)";
    try {
        Connection connection = getConnection(false);
        PreparedStatement pstmt = null;
        ResultSet rs = null;
        boolean success = true;
        int[] executeResult = null;
        try {
            pstmt = connection.prepareStatement(querySql, Statement.RETURN_GENERATED_KEYS);
            for (int i = 0; i < count; i++) {
                pstmt.setString(1, UUID.randomUUID().toString());
                pstmt.addBatch();
                if ((i + 1) % maxBatchSize == 0 || (i + 1) == count) {
                    executeResult = pstmt.executeBatch();
                }
            }
            ResultSet ids = pstmt.getGeneratedKeys();
            for (int i = 0; i < executeResult.length; i++) {
                ids.next();
                if (executeResult[i] == 1) {
                    System.out.println("Execute Result: " + i + ", Update Count: " + executeResult[i]
                            + ", id: " + ids.getLong(1));
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
            success = false;
        } finally {
            if (rs != null) {
                rs.close();
            }
            if (pstmt != null) {
                pstmt.close();
            }
            if (connection != null) {
                if (success) {
                    connection.commit();
                } else {
                    connection.rollback();
                }
                connection.close();
            }
        }
    } catch (SQLException e) {
        e.printStackTrace();
    }
}
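
As a usage sketch (the row count is illustrative, not from the answer), the method above could be invoked with a batch size around the 500 mark mentioned:

// Illustrative call only: insert 10,000 rows in batches of 500,
// per the observation above that larger batches stopped helping.
testBatchInsert(10000, 500);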

@Ali Shakiba your code needs some modification. Error part:

for (int i = 0; i < myArray.length; i++) {
    myStatement.setString(i, myArray[i][1]);
    myStatement.setString(i, myArray[i][2]);
}

Updated code:

String myArray[][] = {
    {"1-1", "1-2"},
    {"2-1", "2-2"},
    {"3-1", "3-2"}
};

StringBuffer mySql = new StringBuffer("insert into MyTable (col1, col2) values (?, ?)");

for (int i = 0; i < myArray.length - 1; i++) {
    mySql.append(", (?, ?)");
}

mySql.append(";"); // also add the terminator at the end of the SQL statement
myStatement = myConnection.prepareStatement(mySql.toString());

for (int i = 0; i < myArray.length; i++) {
    myStatement.setString((2 * i) + 1, myArray[i][0]);
    myStatement.setString((2 * i) + 2, myArray[i][1]);
}

myStatement.executeUpdate();

This might be helpful in your case of passing an array to a PreparedStatement.

Store the required values in an array and pass it to a function that inserts them.

String sql = "INSERT INTO table (col1,col2) VALUES (?,?)";
String array[][] = new String[10][2];
for (int i = 0; i < array.length; i++) {
    // Assigning the values in individual rows.
    array[i][0] = "sampleData1";
    array[i][1] = "sampleData2";
}
try {
    DBConnectionPrepared dbcp = new DBConnectionPrepared();
    if (dbcp.putBatchData(sql, array) != null) {
        System.out.println("Success");
    } else {
        System.out.println("Failed");
    }
} catch (Exception e) {
    e.printStackTrace();
}

putBatchData(sql,2D_Array)

public int[] putBatchData(String sql, String args[][]) {
    int[] status = null;
    try {
        PreparedStatement stmt = con.prepareStatement(sql);
        for (int i = 0; i < args.length; i++) {
            for (int j = 0; j < args[i].length; j++) {
                stmt.setString(j + 1, args[i][j]);
            }
            stmt.addBatch(); // queue this row's parameters
        }
        status = stmt.executeBatch(); // execute the whole batch once
    } catch (Exception e) {
        e.printStackTrace();
    }
    return status;
}