Replies: 2 comments
-
@SatishY77 It makes sense in theory, and I think you need to try it out to figure out whether it works. Please keep us posted. Thanks.
-
Additionally, the custom class overrides elements(), toString(), and hashCode(), but this worked (at least for the samples I tried). Validating a 2 GB JSON file took about 175 MB of memory (versus about 9 GB with a regular JsonNode). Even the 175 MB is probably on the high side because garbage collection wasn't occurring.
-
I am wondering if I could open a task for supporting large files. JSON can get very large, for example JSON from Twitter. I know XML schema validation does support streaming with a SAX source, a stream source, or the StAX API. Why does JSON schema validation not support large files?
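For reference, streaming validation on the XML side looks roughly like this with the JAXP javax.xml.validation API (the schema and instance file names here are just placeholders); the instance document is passed as a StreamSource, so it is not held in memory as a whole:

```java
import java.io.File;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

public class XmlStreamingValidation {
    public static void main(String[] args) throws Exception {
        // W3C XML Schema validation; the instance is streamed from disk,
        // so memory use does not grow with document size.
        SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = factory.newSchema(new File("books.xsd"));       // placeholder schema file
        Validator validator = schema.newValidator();
        validator.validate(new StreamSource(new File("books.xml")));    // placeholder large instance
        System.out.println("valid");
    }
}
```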
Since some of the use cases I have to support do involve large files, I am planning a partially streaming approach. First, read the file with a JsonParser, passing the events (except those inside the arrays) to a JsonGenerator to produce a JsonNode whose top-level arrays are left blank. The array nodes would be a custom class I create, which would return the original array's length when size() is called. Then, when get(i) is called, I would reparse the original file and return the requested node from the array; the stream would be kept open until the array has been completely read. This approach would work (I think) as long as networknt reads arrays sequentially. Do you see any problems with this approach?
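A minimal sketch of what such a lazy array node might look like (the class and field names are hypothetical; it assumes the large array is the root value of the file, that elements are requested in increasing index order, and that a real version would also override toString(), hashCode(), etc. as mentioned above):

```java
import java.io.File;
import java.io.IOException;
import java.util.Iterator;
import java.util.NoSuchElementException;

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;
import com.fasterxml.jackson.databind.node.JsonNodeFactory;

// Hypothetical lazy ArrayNode: reports the original array length from size()
// and re-reads the source file on demand in get(i) instead of keeping the
// elements in memory.
class LazyArrayNode extends ArrayNode {

    private final File source;       // the original large JSON file
    private final int originalSize;  // element count recorded during the first pass

    private final JsonFactory jsonFactory = new JsonFactory();
    private final ObjectMapper mapper = new ObjectMapper();

    private JsonParser parser;       // kept open while the array is read sequentially
    private int nextIndex;           // index of the next element the parser will produce

    LazyArrayNode(JsonNodeFactory nf, File source, int originalSize) {
        super(nf);
        this.source = source;
        this.originalSize = originalSize;
    }

    @Override
    public int size() {
        // Report the real length even though no elements are stored.
        return originalSize;
    }

    @Override
    public JsonNode get(int index) {
        try {
            // Reopen the stream if we have not started yet or the caller went backwards.
            if (parser == null || index < nextIndex) {
                openParserAtArrayStart();
            }
            JsonNode element = null;
            while (nextIndex <= index) {
                if (parser.nextToken() == JsonToken.END_ARRAY) {
                    return null; // index out of range
                }
                element = mapper.readTree(parser); // materialize one element only
                nextIndex++;
            }
            if (nextIndex >= originalSize) {
                parser.close(); // the whole array has been read
                parser = null;
            }
            return element;
        } catch (IOException e) {
            throw new RuntimeException("Failed to re-read " + source, e);
        }
    }

    @Override
    public Iterator<JsonNode> elements() {
        // Iterate via the lazy get(i) so iteration also avoids holding the array in memory.
        return new Iterator<JsonNode>() {
            private int i = 0;
            @Override public boolean hasNext() { return i < size(); }
            @Override public JsonNode next() {
                if (!hasNext()) throw new NoSuchElementException();
                return get(i++);
            }
        };
    }

    private void openParserAtArrayStart() throws IOException {
        if (parser != null) {
            parser.close();
        }
        parser = jsonFactory.createParser(source);
        // Assumption: the large array is the root value of the file.
        if (parser.nextToken() != JsonToken.START_ARRAY) {
            throw new IOException("Expected a top-level array in " + source);
        }
        nextIndex = 0;
    }
}
```

The reparse in get(i) keeps only one element materialized at a time, which is what keeps the memory footprint small as long as access stays sequential.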