How can I update a large number of documents in Mongoose in one go? Urgent.
I'm writing a Node.js crawler.
Currently I'm using bulkWrite:
const Book = mongoose.model('Book', bookSchema);

exports.saveAll = (from_en, books) => {
  const bulkOps = books.map(book => ({
    replaceOne: {
      filter: {
        from_en,
        originId: book.originId
      },
      replacement: book,
      upsert: true
    }
  }));
  return Book.bulkWrite(bulkOps).catch(error => console.error(error));
};
Then I found that processing 11,200 documents this way takes about 600s:
catId: 82 from 5040 to 5600. crawl cost: 10.1min, dataTotal: 11200, upsertTotal: 11000, matchTotal: 200
mongodb is disonnected
mongodb: 603757.883ms
✨ Done in 604.47s.
How can I optimize this?
Below is part of the crawler logic (the code inside the while loop):
Machine specs: i7-6700HQ / 16GB RAM