Overcome python debug adapter's string truncation limit #191
Thanks for pointing out https://github.com/elazarcoh/simply-view-image-for-python-debugging! @elazarcoh are you up to collaborate? ;)
Actually, it looks like they are saving the image on disk and then just showing that image.
Sure, what do you have in mind?
That's right...
Seems to be 64k characters.
Up for becoming a co-maintainer of the Python visualization capability of this extension? We could schedule a virtual meeting to discuss how we could do that. I'm happy to help you with/walk you through all the TypeScript parts; it's just that I don't know much about Python.
Would be awesome if the best of both projects could be combined. Saving a temporary JSON file to disk sounds like a simple solution to overcome the limitations of the Python debug adapter.
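The temp-file idea could be sketched roughly like this (a hypothetical helper, not code from either extension; the function name and wire-up are assumptions). The extension would evaluate this helper inside the debuggee, then read and delete the file at the returned path — only the short path travels through the debug adapter, so the truncation limit never touches the payload:

```python
import json
import tempfile


def dump_to_temp_json(value):
    """Serialize `value` to a temp JSON file and return its path.

    Intended to be called inside the debuggee via the adapter's
    `evaluate` request; the response only carries the file path,
    which is far below any truncation limit.
    """
    with tempfile.NamedTemporaryFile(
        mode="w", suffix=".json", delete=False
    ) as f:
        json.dump(value, f)
        return f.name
```

As noted below, this may not translate well to remote debugging, since the file lands on the debuggee's filesystem, not the extension host's.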
This might work for remote debugging, but it feels like a hacky solution. Maybe we can use this as a fallback if the output would be too long.
We can discuss, sure. Currently I'm busy with the other thing I mentioned, but feel free to contact me via mail.
Oh, that's a shame. I intended to use base64 for serialization. My first alternative would be to use sockets for communication, but I'm not sure how feasible it is. I'm going to need this anyway, because I'm working on a webview, so I'll keep you posted.
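For context, the base64 serialization mentioned here could look roughly like this (a sketch under assumed names, not the extension's actual code). Base64 turns arbitrary binary data into a pure-ASCII string that survives an `evaluate` response cleanly, but the encoded string is still subject to the ~64k truncation, which is what motivates the socket side channel:

```python
import base64


def encode_for_transport(raw: bytes) -> str:
    # base64 yields an ASCII-only string that round-trips safely
    # through the debug adapter, but it is still truncated if it
    # exceeds the adapter's output limit.
    return base64.b64encode(raw).decode("ascii")


def decode_from_transport(text: str) -> bytes:
    return base64.b64decode(text)
```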
Alright. I managed to put together something that seems to be working for the JSON case. The numpy case is still a WIP (mostly due to the complexity of parsing arrays: datatype, byte order, alignment, etc.). You can see in main.py that I'm sending 6 MB worth of JSON text, and it is received in its entirety in the extension. I'm currently working on integrating it into my extension, but feel free to start playing with it and ask questions.

PS: usage example in the extension:

```typescript
const socketServer = Container.get(SocketServer);
socketServer.start();
const requestId = 1; // identifies this request; parsed back out of the response
const varName = "foo";
session.customRequest("evaluate", {
    expression: `open_send_and_close(${socketServer.portNumber}, ${requestId}, ${varName}, ${0x02})`, // 0x02 is Json
    frameId,
    context,
} as DebugProtocol.EvaluateArguments);
```
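The debuggee-side `open_send_and_close` counterpart might look something like the sketch below. This is an assumption about the shape of the helper, not the actual implementation — in particular, the wire format (`request_id`, format tag, and payload length as big-endian u32s, followed by the JSON bytes) is invented here for illustration:

```python
import json
import socket
import struct


def open_send_and_close(port, request_id, value, fmt):
    """Connect to the extension's local socket server, send one
    framed message, and close.

    Assumed frame layout:
      [request_id: u32][fmt: u32][length: u32][payload bytes]
    where fmt 0x02 tags the payload as JSON.
    """
    payload = json.dumps(value).encode("utf-8")
    with socket.create_connection(("127.0.0.1", port)) as sock:
        sock.sendall(struct.pack("!III", request_id, fmt, len(payload)))
        sock.sendall(payload)
```

Because the data bypasses the `evaluate` response entirely, the payload size is bounded only by the socket, not by the adapter's output limit.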
@hediet I finished the implementation in my extension. Do you want me to add it to your extension? Can you point me to where this can go?
The problem with a server, however, is that it doesn't run while being debugged. How did you solve that problem?
I'm not sure what you mean by that.
Hey!
Thanks for the great extension! I love how simple it is to add your own custom plots!
I am trying to add more plots for numpy/pytorch arrays, but I quickly ran into the issue of the maximum string length in the debug adapter, according to this post.
From some quick tests, I was able to narrow down the maximum supported string length, and it seems to be around 64k characters, which is not much when you work with large arrays.
Fortunately, it seems like they were able to overcome this issue in this extension. Without really understanding how your and their extensions work, I was hoping that you would be able to find out why they can use larger arrays.
I guess they might be using different data types for communication which are not truncated. But an even better solution would be if they had figured out a way to disable the truncation for this extension altogether.