We use the socket appender with the logstash layout to send data to our Kibana instance; I suspect this is a pretty common setup.
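For context, our setup looks roughly like this (a minimal sketch; the host, port, and template URI are placeholders, while the two limits are the actual values discussed below):

```xml
<Configuration>
  <Appenders>
    <!-- Ships JSON-formatted events over TCP; host/port are placeholders -->
    <Socket name="socket" host="logstash.example.com" port="12345">
      <LogstashLayout eventTemplateUri="classpath:LogstashJsonEventLayoutV1.json"
                      stackTraceEnabled="true"
                      maxStringLength="16000"
                      maxByteCount="8000"/>
    </Socket>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="socket"/>
    </Root>
  </Loggers>
</Configuration>
```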
Occasionally we see a failure to write to Kibana with the following stack trace:
org.apache.logging.log4j.core.appender.AppenderLoggingException: An exception occurred processing Appender socket
at org.apache.logging.log4j.core.appender.DefaultErrorHandler.error(DefaultErrorHandler.java:93)
at org.apache.logging.log4j.core.config.AppenderControl.handleAppenderError(AppenderControl.java:165)
at org.apache.logging.log4j.core.config.AppenderControl.tryCallAppender(AppenderControl.java:158)
at org.apache.logging.log4j.core.config.AppenderControl.callAppender0(AppenderControl.java:129)
at org.apache.logging.log4j.core.config.AppenderControl.callAppenderPreventRecursion(AppenderControl.java:120)
at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:84)
at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:464)
at org.apache.logging.log4j.core.async.AsyncLoggerConfig.callAppenders(AsyncLoggerConfig.java:127)
at org.apache.logging.log4j.core.config.LoggerConfig.processLogEvent(LoggerConfig.java:448)
at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:431)
at org.apache.logging.log4j.core.async.AsyncLoggerConfig.log(AsyncLoggerConfig.java:121)
at org.apache.logging.log4j.core.async.AsyncLoggerConfig.logToAsyncLoggerConfigsOnCurrentThread(AsyncLoggerConfig.java:169)
at org.apache.logging.log4j.core.async.AsyncLoggerConfigDisruptor$Log4jEventWrapperHandler.onEvent(AsyncLoggerConfigDisruptor.java:111)
at org.apache.logging.log4j.core.async.AsyncLoggerConfigDisruptor$Log4jEventWrapperHandler.onEvent(AsyncLoggerConfigDisruptor.java:97)
at com.lmax.disruptor.BatchEventProcessor.processEvents(BatchEventProcessor.java:168)
at com.lmax.disruptor.BatchEventProcessor.run(BatchEventProcessor.java:125)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: failed serializing JSON
at com.vlkan.log4j2.logstash.layout.LogstashLayout.toByteArray(LogstashLayout.java:214)
at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.writeByteArrayToManager(AbstractOutputStreamAppender.java:204)
at org.apache.logging.log4j.core.appender.SocketAppender.directEncodeEvent(SocketAppender.java:459)
at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.tryAppend(AbstractOutputStreamAppender.java:190)
at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.append(AbstractOutputStreamAppender.java:181)
at org.apache.logging.log4j.core.config.AppenderControl.tryCallAppender(AppenderControl.java:156)
... 14 more
Caused by: java.nio.BufferOverflowException
at java.nio.HeapByteBuffer.put(HeapByteBuffer.java:192)
at com.vlkan.log4j2.logstash.layout.util.ByteBufferOutputStream.write(ByteBufferOutputStream.java:49)
at com.fasterxml.jackson.core.json.UTF8JsonGenerator._flushBuffer(UTF8JsonGenerator.java:2085)
at com.fasterxml.jackson.core.json.UTF8JsonGenerator._writeStringSegment2(UTF8JsonGenerator.java:1362)
at com.fasterxml.jackson.core.json.UTF8JsonGenerator._writeStringSegment(UTF8JsonGenerator.java:1315)
at com.fasterxml.jackson.core.json.UTF8JsonGenerator._writeStringSegments(UTF8JsonGenerator.java:1254)
at com.fasterxml.jackson.core.json.UTF8JsonGenerator.writeString(UTF8JsonGenerator.java:525)
at com.fasterxml.jackson.core.util.JsonGeneratorDelegate.writeString(JsonGeneratorDelegate.java:256)
at com.fasterxml.jackson.core.filter.FilteringGeneratorDelegate.writeString(FilteringGeneratorDelegate.java:371)
at com.vlkan.log4j2.logstash.layout.resolver.StackTraceTextResolver.resolve(StackTraceTextResolver.java:57)
at com.vlkan.log4j2.logstash.layout.resolver.ExceptionResolver$1.lambda$createStackTraceTextResolver$2(ExceptionResolver.java:67)
at com.vlkan.log4j2.logstash.layout.resolver.ExceptionResolver.resolve(ExceptionResolver.java:98)
at com.vlkan.log4j2.logstash.layout.resolver.ExceptionResolver.resolve(ExceptionResolver.java:25)
at com.vlkan.log4j2.logstash.layout.resolver.TemplateResolvers.lambda$ofObjectNode$5(TemplateResolvers.java:146)
at com.vlkan.log4j2.logstash.layout.resolver.TemplateResolvers.lambda$ofObjectNode$5(TemplateResolvers.java:146)
at com.vlkan.log4j2.logstash.layout.LogstashLayout.encode(LogstashLayout.java:264)
at com.vlkan.log4j2.logstash.layout.LogstashLayout.toByteArray(LogstashLayout.java:210)
... 19 more
At first I assumed this was a problem on the Kibana side and simply ignored it, but then I started digging around in the layout's code. In our configuration we set maxStringLength = 16000 and maxByteCount = 8000. I noticed that in EventResolverContext the writerCapacity looks at both of these values: maxStringLength wins whenever it is greater than 0, so the layout sizes its writer for 16000 characters and then tries to flush that into an 8000-byte buffer, which can overflow. The selection works along these lines (a sketch reconstructing the logic described here, not the verbatim library source):
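```java
// Sketch of the writerCapacity selection described above -- reconstructed
// from the observed behavior, not the verbatim library source.
final class WriterCapacityDemo {

    static int writerCapacity(int maxStringLength, int maxByteCount) {
        // maxStringLength takes precedence whenever it is positive, even
        // when it exceeds the byte buffer backing the serialized output.
        return maxStringLength > 0 ? maxStringLength : maxByteCount;
    }

    public static void main(String[] args) {
        // Our settings: a 16000-char string limit but an 8000-byte buffer.
        // The writer is sized larger than the buffer, so a long stack trace
        // can trigger java.nio.BufferOverflowException on flush.
        System.out.println(writerCapacity(16_000, 8_000)); // prints 16000
    }
}
```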
I believe the writer capacity should always be maxByteCount. The only reason we set a max string length at all is an operational requirement not to log any single attribute above that size.
@johnament, thanks for the report and the detailed analysis. LogstashLayout has been superseded by the JsonTemplateLayout of the log4j-layout-template-json module since the Log4j 2.14.0 release. (I have just added a warning about this to the README.md.) Is switching to JsonTemplateLayout an option for you?
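For reference, the swap looks roughly like this (a sketch; host and port are placeholders, and the template URI is one of the templates bundled with log4j-layout-template-json):

```xml
<!-- Requires the log4j-layout-template-json dependency, Log4j >= 2.14.0 -->
<Socket name="socket" host="logstash.example.com" port="12345">
  <JsonTemplateLayout eventTemplateUri="classpath:LogstashJsonEventLayoutV1.json"/>
</Socket>
```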
@vy Yep, I knew you were integrating it at some point. From my POV this isn't critical, but no, we aren't planning to upgrade to 2.14 right now. For now we're just removing our maxByteCount to avoid the issue.