Recently I ran into a scenario in a project that required inserting data in bulk. At around 200,000 rows it already took close to 36 s (with indexes on the table).
The inserts were done with MyBatis dynamic SQL, with code roughly like this:
<insert id="batchInsert" useGeneratedKeys="true" keyProperty="id">
INSERT INTO sample(X,X,X)
VALUES
<foreach collection="list" item="item" separator=",">
(#{X},#{X},#{X})
</foreach>
</insert>
The SQL that finally gets assembled and executed looks roughly like this:
INSERT INTO sample(X,X,X) VALUES (X,X,X),(X,X,X),(X,X,X) ……
For query optimization you can still work on the indexes, but in a batch-insert scenario there is not much room to negotiate with the SQL itself. The only idea I had was to split the dataset into batches and keep the length of each individual statement under control. Was that really the limit?
At a colleague's suggestion I profiled the run with JProfiler and looked at the call-tree timings: of the roughly 36 s total, about 10 s was spent in PreparedStatement.setXXX() parameter assignments. Just to see what would happen, I built the whole SQL statement directly with StringBuilder.append() and executed that instead; it took only about 11 s, more than a 3x difference! Some overhead from MyBatis processing the dynamic SQL was expected, but the gap was far larger than I had imagined.
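To make the difference concrete, here is a minimal JDBC-level sketch of the two code paths. It is illustrative only: the three-column table with column names c1/c2/c3, the method names, and passing rows as String arrays are all assumptions for the sketch, not code from the project.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.List;

public class BindVsConcat {

    // Path 1: roughly what MyBatis's #{} binding boils down to -- one "?" placeholder
    // per value, then a PreparedStatement.setString() call for every single parameter.
    // Those setXXX() calls are where the ~10 s showed up in the profiler.
    static void insertWithBinding(Connection conn, List<String[]> rows) throws SQLException {
        StringBuilder sql = new StringBuilder("INSERT INTO sample(c1,c2,c3) VALUES ");
        for (int i = 0; i < rows.size(); i++) {
            sql.append(i == 0 ? "(?,?,?)" : ",(?,?,?)");
        }
        try (PreparedStatement ps = conn.prepareStatement(sql.toString())) {
            int idx = 1;
            for (String[] row : rows) {
                ps.setString(idx++, row[0]);
                ps.setString(idx++, row[1]);
                ps.setString(idx++, row[2]);
            }
            ps.executeUpdate();
        }
    }

    // Path 2: splice the literal values into one SQL string and execute it directly,
    // skipping parameter binding entirely (which also means escaping is your own problem).
    static void insertByConcat(Connection conn, List<String[]> rows) throws SQLException {
        StringBuilder sql = new StringBuilder("INSERT INTO sample(c1,c2,c3) VALUES ");
        for (int i = 0; i < rows.size(); i++) {
            String[] row = rows.get(i);
            if (i > 0) {
                sql.append(",");
            }
            sql.append("('").append(row[0]).append("','")
               .append(row[1]).append("','").append(row[2]).append("')");
        }
        try (Statement st = conn.createStatement()) {
            st.executeUpdate(sql.toString());
        }
    }
}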
I pulled this scenario out on its own and ran an experiment; the setup is as follows.
Assume each record to be inserted has 10 String fields, each holding "abcdefghijklmnopqrstuvwxyz"; rows are inserted 10,000 per batch, 500,000 in total.
Everything was reproduced in a SpringBootTest.
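The enclosing test class isn't shown in the original write-up; assuming a JUnit 4 setup, a minimal skeleton that the two test methods further down would live in might look like this (the class name is made up):

import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
public class BatchInsertCompareTest {
    // The @Autowired SampleMapper / JdbcTemplate fields and the testMyBatis()
    // and testAppend() methods shown below are members of a class like this.
}

The table used for the test: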
DROP TABLE IF EXISTS `sample`;
CREATE TABLE `sample` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`col1` varchar(255) COLLATE utf8_bin DEFAULT NULL,
`col2` varchar(255) COLLATE utf8_bin DEFAULT NULL,
`col3` varchar(255) COLLATE utf8_bin DEFAULT NULL,
`col4` varchar(255) COLLATE utf8_bin DEFAULT NULL,
`col5` varchar(255) COLLATE utf8_bin DEFAULT NULL,
`col6` varchar(255) COLLATE utf8_bin DEFAULT NULL,
`col7` varchar(255) COLLATE utf8_bin DEFAULT NULL,
`col8` varchar(255) COLLATE utf8_bin DEFAULT NULL,
`col9` varchar(255) COLLATE utf8_bin DEFAULT NULL,
`col10` varchar(255) COLLATE utf8_bin DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin;
@Data
@Builder
@AllArgsConstructor
@NoArgsConstructor
public class Sample implements Serializable {
    private Long id;
    private String col1;
    private String col2;
    private String col3;
    private String col4;
    private String col5;
    private String col6;
    private String col7;
    private String col8;
    private String col9;
    private String col10;
}
public List<Sample> buildData() {
    // Build 500,000 records; every column holds the same 26-character string.
    List<Sample> samples = new ArrayList<>();
    String col = "abcdefghijklmnopqrstuvwxyz";
    for (int i = 0; i < 500000; i++) {
        Sample sample = Sample.builder()
                .col1(col)
                .col2(col)
                .col3(col)
                .col4(col)
                .col5(col)
                .col6(col)
                .col7(col)
                .col8(col)
                .col9(col)
                .col10(col)
                .build();
        samples.add(sample);
    }
    return samples;
}
The dynamic SQL is as follows:
<insert id="batchInsertSamples" parameterType="java.util.List" useGeneratedKeys="true" keyProperty="id">
INSERT INTO sample(col1,col2,col3,col4,col5,col6,col7,col8,col9,col10)
VALUES
<foreach collection="samples" item="item" separator=",">
(#{item.col1},#{item.col2},#{item.col3},#{item.col4},#{item.col5},
#{item.col6},#{item.col7},#{item.col8},#{item.col9},#{item.col10})
</foreach>
</insert>
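The mapper interface itself isn't part of the original write-up; a minimal sketch that is consistent with the XML above (interface and method names assumed, with the @Param value matching collection="samples" in the <foreach>) could look like this:

import java.util.List;
import org.apache.ibatis.annotations.Mapper;
import org.apache.ibatis.annotations.Param;

@Mapper
public interface SampleMapper {
    // Corresponds to <insert id="batchInsertSamples"> above; the @Param name
    // is what <foreach collection="samples"> resolves against.
    int batchInsertSamples(@Param("samples") List<Sample> samples);
}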
Test code:
@Autowired
private SampleMapper sampleMapper;

@Test
public void testMyBatis() {
    List<Sample> samples = buildData();
    StopWatch stopWatch = new StopWatch();
    System.out.println("Starting batch insert via MyBatis");
    stopWatch.start();
    // Split the 500,000 rows into chunks of 10,000 and run one multi-row INSERT per chunk.
    List<List<Sample>> batch = Lists.partition(samples, 10000);
    for (List<Sample> part : batch) {
        sampleMapper.batchInsertSamples(part);
    }
    stopWatch.stop();
    System.out.println("MyBatis batch insert finished, elapsed ms: " + stopWatch.getTotalTimeMillis());
}
The MyBatis version took 26.439 s. The hand-concatenated version, executed through JdbcTemplate:
@Autowired
private JdbcTemplate jdbcTemplate;

@Test
public void testAppend() {
    List<Sample> samples = buildData();
    String prefix = "INSERT INTO sample(col1,col2,col3,col4,col5,col6,col7,col8,col9,col10) VALUES";
    StopWatch stopWatch = new StopWatch();
    System.out.println("Starting insert with hand-concatenated SQL");
    stopWatch.start();
    List<List<Sample>> batch = Lists.partition(samples, 10000);
    for (List<Sample> part : batch) {
        StringBuilder sb = new StringBuilder();
        for (Sample sample : part) {
            sb.append("(");
            sb.append("\"" + sample.getCol1() + "\"").append(",");
            sb.append("\"" + sample.getCol2() + "\"").append(",");
            sb.append("\"" + sample.getCol3() + "\"").append(",");
            sb.append("\"" + sample.getCol4() + "\"").append(",");
            sb.append("\"" + sample.getCol5() + "\"").append(",");
            sb.append("\"" + sample.getCol6() + "\"").append(",");
            sb.append("\"" + sample.getCol7() + "\"").append(",");
            sb.append("\"" + sample.getCol8() + "\"").append(",");
            sb.append("\"" + sample.getCol9() + "\"").append(",");
            sb.append("\"" + sample.getCol10() + "\"");
            sb.append(")");
            sb.append(",");
        }
        // Strip the trailing comma, then execute the fully assembled statement.
        String sql = prefix + sb.replace(sb.length() - 1, sb.length(), "");
        jdbcTemplate.execute(sql);
    }
    stopWatch.stop();
    System.out.println("Concatenated-SQL batch insert finished, elapsed ms: " + stopWatch.getTotalTimeMillis());
}
The concatenated version took 13.473 s.
Nearly a 2x difference, which is quite significant.
I never expected a simple parameter-assignment step to account for this much of a gap once the data volume gets large.