
Intermittently unable to insert data #48

Open
theburn opened this issue Mar 18, 2018 · 0 comments

theburn commented Mar 18, 2018

The data flow is as follows:

wrk (a modern HTTP benchmarking tool) ---> logstash ---> mongodb

  • Version:
  1. Logstash 5.6.7
  2. logstash-output-mongo: 3.1.3
  3. MongoDB 3.4.13
  • Operating System:
  1. CentOS 7.2 (4 cores, 8 GB RAM)
  • Config File

The JVM, startup, and Logstash settings are all defaults.

input {
        http {
            host => "0.0.0.0"
            port => 50001
            type => "metric"
            threads => 256
        }
}

filter {
    
        if [type] == "metric" {

            if [message] == "epoch,value"{
                drop { }
            }

            ruby {

                code => "
                        event.set('insertdate', event.get('@timestamp'))
                        event.set('[@metadata][type]', event.get('type'))
                        event.set('[@metadata][collection]', event.get('collection'))
                        event.set('expireAt', Time.now + (7*24*60*60))
                        "
            }

             mutate {
                split => {"message" => ","}
                add_field => {
                        "time" => "%{message[0]}"
                }
            }

            ruby {
                code => "
                        event.set('value', event.get('message').drop(1))
                        "
            }

            mutate {
                join => {
                    "value" => ","
                }
                
                remove_field => ["host","@timestamp","@version","headers","tags","message","type","collection"]
            }
  
        }
}


output {

        if [@metadata][type] == "metric" {
            mongodb {
                bulk => true
                bulk_interval => 1
                bulk_size => 900
                codec => json
                collection => "%{[@metadata][collection]}"
                isodate => true
                database => "metric"
                uri => "mongodb://aaa:[email protected]/admin"
            }
        }
}
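The filter chain above amounts to: drop the CSV header line, split `message` on commas, take the first field as `time`, and join the remaining fields back into `value`, while stamping `insertdate` and a 7-day `expireAt`. A minimal plain-Ruby sketch of that transformation (the hash is a hypothetical stand-in for a Logstash event; field names follow the config):

```ruby
require 'time'

# Simulate the filter chain on a plain hash standing in for a Logstash event.
def transform(event)
  return nil if event['message'] == 'epoch,value'   # drop {} on the CSV header

  parts = event['message'].split(',')               # mutate { split => ... }
  {
    'time'       => parts[0],                       # add_field "time"
    'value'      => parts.drop(1).join(','),        # ruby drop(1) + mutate join
    'insertdate' => event['@timestamp'],            # copy of @timestamp
    'expireAt'   => Time.now + (7 * 24 * 60 * 60),  # TTL: 7 days from now
  }
end

sample = {
  'message'    => '1488181693.035,12,34,56',
  '@timestamp' => Time.parse('2018-03-18T00:00:00Z'),
}
doc = transform(sample)
puts doc['time']   # prints "1488181693.035"
puts doc['value']  # prints "12,34,56"
```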
  • Sample Data:
{
    "collection":"03406bdd-560f-4525-89a1-8929dbe72e16.app.rsptime'",
    "message":"1488181693.035,xxxxxxxxxxxxxxxxxxxxxxxxxxx..."  # average length is ~512 bytes
}
  • Steps to Reproduce:
  1. Start Logstash with the config above.
  2. wrk processes send POST requests.
  3. Logstash inserts the data into MongoDB.
  4. Use mongostat to monitor; continuous runs of 0 appear in the insert column.

Shell> mongostat -u xxx -p xxxx --authenticationDatabase admin
insert query update delete getmore command ... ...
  1000
  1440
  ....
     0
     0
   ...      // about 1-2 minutes of zeros
     0
  1200
  1100
   ...      // about 1-3 minutes of inserts
     0
     0
   ...
(pattern repeats)

  1. While the insert column shows 0, MongoDB cannot serve queries (db.xxx.find() blocks).

At the same time, I wrote a test program in Go that uses bulk inserts, and it works well:
the insert column stays around 5000, and Mongo queries also respond normally.
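For reference, the plugin's `bulk => true`, `bulk_size => 900`, and `bulk_interval => 1` settings mean documents are buffered and flushed in batches, either when 900 are pending or when the interval elapses. A simplified, hypothetical sketch of that flushing logic in plain Ruby (the `flush` callback stands in for the actual write to MongoDB, e.g. an `insert_many` call):

```ruby
# Simplified sketch of bulk buffering: flush when the buffer reaches
# bulk_size documents or bulk_interval seconds have passed, whichever
# comes first. Not the plugin's real implementation.
class BulkBuffer
  def initialize(bulk_size:, bulk_interval:, &flush)
    @bulk_size     = bulk_size
    @bulk_interval = bulk_interval
    @flush         = flush        # e.g. ->(docs) { collection.insert_many(docs) }
    @buffer        = []
    @last_flush    = Time.now
  end

  def push(doc)
    @buffer << doc
    flush! if @buffer.size >= @bulk_size ||
              Time.now - @last_flush >= @bulk_interval
  end

  def flush!
    return if @buffer.empty?
    @flush.call(@buffer)          # one bulk write per batch
    @buffer = []
    @last_flush = Time.now
  end
end

batches = []
buf = BulkBuffer.new(bulk_size: 3, bulk_interval: 60) { |docs| batches << docs.dup }
5.times { |i| buf.push('n' => i) }
buf.flush!                        # drain the remainder
puts batches.map(&:size).inspect  # prints [3, 2]
```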
