should automatically split batches per 500 documents #324

Hello,
I have a problem and I can't solve it. At some point I will need to insert what can reach 1,000 documents at once.
The problem is as follows: whether I dispatch('myModule/insert', doc) in a loop or dispatch('moduleName/insertBatch', docs), only 500 documents end up inserted. I don't know how to solve it, and I don't understand why.
Can anybody help me?

Comments
I managed to work around it by breaking my list into smaller pieces and running several batches. But the insert command executed in a loop is still limited to 500 inserts. Should that be so? Shouldn't the library split the batch automatically? In my snippet, the result is that only 500 docs get inserted.
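
For anyone hitting the same limit, here is a minimal sketch of that chunking workaround. The module name and function name are placeholders, not part of the library, and whether each chunk actually goes through depends on the library version discussed below:

```js
// Sketch of the workaround: split the docs into chunks of 500
// (Firestore's per-batch write limit) and dispatch one insertBatch per chunk.
const BATCH_LIMIT = 500

async function insertInChunks (store, docs) {
  for (let i = 0; i < docs.length; i += BATCH_LIMIT) {
    const chunk = docs.slice(i, i + BATCH_LIMIT)
    // 'moduleName' is a placeholder for your vuex-easy-firestore module
    await store.dispatch('moduleName/insertBatch', chunk)
  }
}
```
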
@mardonedias hi, thanks for your issue!
It's funny, I might have solved this today without actually running into the issue myself.
Hello, it is a very specific use case to import this many records at once. In my case I need to get data from a spreadsheet that can easily reach 1,500 rows. I did a test with dispatch('moduleName/insertBatch', docs). The problem persists: only one batch with 500 records is executed. The problem also occurs when removing with dispatch('moduleName/deleteBatch', ids); only 500 records are removed at a time. I ran the test to confirm. In the case of deletion, I tried to use the same strategy I used for insertion, breaking the data into smaller pieces, but it does not delete more than 500 records in the same command. This tool is exceptional, it makes the job very simple.
@mardonedias since @louisameline's change is non-breaking, do you want me to push the change to the latest version on NPM so you can test by just updating? Or did you manage to test his forked repository?

Edit: yes, I see in your screenshot now, you did use his fork.

Edit 2: with @louisameline's branch, please also try dispatching each doc individually, WITHOUT using insertBatch:

```js
// docs should be more than 500
docs.forEach(doc => $store.dispatch('moduleName/insert', doc))
```

Let me know!
Ancient thread, but the reason is Firebase: Firestore only accepts batches of 500 writes at a time. The best way to handle this is to use Cloud Functions, and have a Cloud Function perform the writes. I can be more explicit if anyone needs more help.
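
A rough sketch of what that could look like: a hypothetical callable Cloud Function that receives the documents and commits them in batches of up to 500 writes. The function name, collection path, and payload shape are made up for illustration, not taken from this thread.

```js
const functions = require('firebase-functions')
const admin = require('firebase-admin')
admin.initializeApp()

// Hypothetical callable function: writes an arbitrary number of docs
// by committing one Firestore batch per 500 writes.
exports.bulkInsert = functions.https.onCall(async (data) => {
  const docs = data.docs || []
  const col = admin.firestore().collection('myCollection') // placeholder path
  for (let i = 0; i < docs.length; i += 500) {
    const batch = admin.firestore().batch()
    docs.slice(i, i + 500).forEach(d => batch.set(col.doc(), d))
    await batch.commit()
  }
  return { written: docs.length }
})
```
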
I don't think this is fixed in vuex-easy-firestore 🤔 I solved it in magnetar, its successor: it auto-batches per 500 and keeps making API calls until everything is synced. Open to PRs!
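
To illustrate the idea (this is not the actual magnetar or vuex-easy-firestore code, just a sketch using the modular Firestore web SDK with placeholder names):

```js
import { getFirestore, writeBatch, doc, collection } from 'firebase/firestore'

// Sketch: commit an arbitrary number of writes by splitting them
// across several Firestore batches of at most 500 operations each.
async function commitInBatches (items, collectionPath) {
  const db = getFirestore()
  for (let i = 0; i < items.length; i += 500) {
    const batch = writeBatch(db)
    items.slice(i, i + 500).forEach(item => {
      batch.set(doc(collection(db, collectionPath)), item) // auto-generated doc ID
    })
    await batch.commit()
  }
}
```
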