import_bulk fails to import large input files #41
When trying to use the `import_bulk` API from `python-arango`, I noticed that it fails to import all the docs. As input to the function, I have to call `.to_dict` on all the objects, because they are instances of the `IndalekoObjects` class and need to be converted to dictionaries to be JSON serializable. The error I get is:

Exception: Can't connect to host(s) within limit (3)

The size of the document is 827481.
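A minimal sketch of an `import_bulk` call of this shape, assuming a database named `Indaleko`, a collection named `Objects`, and a list `indaleko_objects` holding the source objects (all hypothetical names, not the issue author's actual values):

```python
from arango import ArangoClient

# All names below (host, database, credentials, collection, and the
# indaleko_objects list) are placeholders for illustration.
client = ArangoClient(hosts="http://localhost:8529")
db = client.db("Indaleko", username="root", password="passwd")
collection = db.collection("Objects")

indaleko_objects = []  # stand-in for the real IndalekoObjects instances

# Convert each object to a plain dict so it is JSON serializable,
# then send the entire batch to the server in a single request.
docs = [obj.to_dict() for obj in indaleko_objects]
result = collection.import_bulk(docs)
```

The `(3)` in the error message corresponds to python-arango's connection retry limit, so a single oversized request can surface as a connection failure once per-request timeouts exhaust the retries, rather than the host being genuinely unreachable.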
Comments

Interesting. I haven't tried to use the bulk uploader API call, and I haven't seen this issue with the external import tool. Is the issue that by using a lambda, you're injecting the time to construct the dictionaries into the "connect" sequence? Would it be more resilient to just build the dictionaries first? Plus, one reason I avoided going down this path is concerns I had with batching entries (which the external tool already seems to handle).
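A sketch of the alternative this comment suggests, reusing the hypothetical names from the sketch above: build all the dictionaries first, then import in fixed-size chunks so no single request grows unbounded. The batch size is an arbitrary placeholder.

```python
BATCH_SIZE = 10_000  # arbitrary; tune to the deployment

# Build the dictionaries up front, outside the request path.
docs = [obj.to_dict() for obj in indaleko_objects]

# Import in fixed-size chunks so each request stays small enough
# to complete well within the client's timeout.
for start in range(0, len(docs), BATCH_SIZE):
    batch = docs[start:start + BATCH_SIZE]
    result = collection.import_bulk(batch, on_duplicate="update")
    print(f"imported {start + len(batch)}/{len(docs)}: {result}")
```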
Yes, it might be that. I can try to build the array first and pass it to the function. If that doesn't work properly, I'd then simply run the external import tool instead.

I worked on this issue and tried the following methods:
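For reference, if the failure is the client-side request timeout, python-arango's documented knob is to supply a custom HTTP client with a longer per-request timeout; a minimal sketch (the 600-second value is arbitrary):

```python
from arango import ArangoClient
from arango.http import DefaultHTTPClient

class LongTimeoutHTTPClient(DefaultHTTPClient):
    # Default is 60 seconds; one huge import request can need far longer.
    REQUEST_TIMEOUT = 600

client = ArangoClient(
    hosts="http://localhost:8529",
    http_client=LongTimeoutHTTPClient(),
)
```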