Handling very large json array #403
Comments
Can you post a backtrace with gdb and the type you're deserializing into?
I don't know the structure upfront.
@PSeitz No, you need to use gdb in the case of OOM or overflows.
This example code shows one way to process an array of values without having them all in memory at the same time.
The program gets killed by SIGKILL; is it possible to get a stack trace there? An attached gdb didn't help. @dtolnay
I'll close this in favor of #404
I have quite a large 17 GB JSON file, which is an array of JSON objects:
[ {a:2} ... {b:2} ]
When I try to deserialize the whole object, my machine with 96 GB of RAM gets an OOM, which is a little odd. Memory consumption for
serde_json::from_str(&s)
seems to be quite high. I saw there is a StreamDeserializer, although it seems it handles only data in the form of
{a:2} {b:2} ...
https://docs.serde.rs/serde_json/de/struct.StreamDeserializer.html