The process exits when a duplicate _id is inserted #77

Open

xjiaqing opened this issue Jan 15, 2021 · 0 comments

Comments

xjiaqing commented Jan 15, 2021

logstash-output-mongodb plugin version: 3.1.6
logstash version: 7.6.2
mongodb server version: 4.0.9

When inserting data, if the specified _id conflicts with an existing document, the whole Logstash process exits. The exception is as follows:

logstash[16300]: warning: thread "Ruby-0-Thread-84: :1" terminated with exception (report_on_exception is true):
logstash[16300]: Mongo::Error::BulkWriteError: Mongo::Error::BulkWriteError: : E11000 duplicate key error collection: adx_requests_at_hour_11.request_logs_1 index: _id_ dup key: { : "1d2ad817737849978c21a7937859001a" } (11000)
logstash[16300]: validate! at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/bulk_write/result.rb:184
logstash[16300]: result at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/bulk_write/result_combiner.rb:83
logstash[16300]: execute at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/bulk_write.rb:87
logstash[16300]: bulk_write at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/collection.rb:591
logstash[16300]: insert_many at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/collection.rb:568
logstash[16300]: register at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-mongodb-3.1.6/lib/logstash/outputs/mongodb.rb:68
logstash[16300]: each at org/jruby/RubyHash.java:1428
logstash[16300]: register at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-mongodb-3.1.6/lib/logstash/outputs/mongodb.rb:66
logstash[16300]: synchronize at org/jruby/ext/thread/Mutex.java:164
logstash[16300]: register at /usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-mongodb-3.1.6/lib/logstash/outputs/mongodb.rb:65
logstash[16300]: [2021-01-14T11:56:30,580][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<Mongo::Error::BulkWriteError: Mongo::Error::BulkWriteError: : E11000 duplicate key error collection: adx_requests_at_hour_11.request_logs_1 index: _id_ dup key: { : "1d2ad817737849978c21a7937859001a" } (11000)>, :backtrace=>["/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/bulk_write/result.rb:184:in `validate!'", "/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/bulk_write/result_combiner.rb:83:in `result'", "/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/bulk_write.rb:87:in `execute'", "/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/collection.rb:591:in `bulk_write'", "/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/mongo-2.11.4/lib/mongo/collection.rb:568:in `insert_many'", "/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-mongodb-3.1.6/lib/logstash/outputs/mongodb.rb:68:in `block in register'", "org/jruby/RubyHash.java:1428:in `each'", "/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-mongodb-3.1.6/lib/logstash/outputs/mongodb.rb:66:in `block in register'", "org/jruby/ext/thread/Mutex.java:164:in `synchronize'", "/usr/local/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-mongodb-3.1.6/lib/logstash/outputs/mongodb.rb:65:in `block in register'"]}
logstash[16300]: connecting to redis: logstash:filter:seller:60006, value: Nox
logstash[16300]: [2021-01-14T11:56:32,204][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
systemd[1]: logstash@adx_request_tsv.service: main process exited, code=exited, status=1/FAILURE
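For reference, the same failure can be reproduced outside Logstash with the mongo Ruby driver. This is only a minimal sketch (assumed setup: a local mongod and a throwaway test_db / dup_test collection, both hypothetical), illustrating that insert_many raises Mongo::Error::BulkWriteError on a duplicate _id, which is the exception shown in the log above:

require 'mongo'

# Assumed, hypothetical connection and collection for the reproduction.
client = Mongo::Client.new('mongodb://127.0.0.1:27017/test_db')
coll   = client[:dup_test]

docs = [
  { _id: '1d2ad817737849978c21a7937859001a', msg: 'first'  },
  { _id: '1d2ad817737849978c21a7937859001a', msg: 'second' }  # duplicate _id
]

begin
  coll.insert_many(docs)
rescue Mongo::Error::BulkWriteError => e
  # E11000 duplicate key error (code 11000). If the exception is rescued here,
  # the caller keeps running instead of the thread terminating.
  puts "bulk write failed: #{e.message}"
end

If the plugin rescued this error around its bulk insert instead of letting the flush thread die, the duplicate would presumably be skipped without taking down the whole Logstash process.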

My output config:

mongodb {
    bulk => true
    bulk_interval => 3
    bulk_size => 900
    
    uri => "mongodb://127.0.0.1:10050/adx_requests_at_hour_23"
    database => "adx_requests_at_hour_23"
    collection => "request_logs_%{mongo_i}"
    generateId => false
}

In issue #10, it looks like this problem was already solved, which confuses me. Is there something wrong with my config?
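One workaround I am considering is letting the plugin generate the _id itself, so a supplied id can never collide. This is only a sketch and assumes a plugin-generated BSON ObjectId is acceptable, i.e. the events no longer need to carry their own _id (and duplicates are no longer rejected by the index):

mongodb {
    bulk => true
    bulk_interval => 3
    bulk_size => 900

    uri => "mongodb://127.0.0.1:10050/adx_requests_at_hour_23"
    database => "adx_requests_at_hour_23"
    collection => "request_logs_%{mongo_i}"
    generateId => true   # assumption: plugin-generated ObjectIds are acceptable instead of the custom _id
}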

vitalik-dj added a commit to vitalik-dj/logstash-output-mongodb that referenced this issue Apr 19, 2024