Proper way to return tensor from node #124
Asked by WASasquatch in Q&A
-
I am having issues converting a NumPy array to a proper tensor: the result comes out as 768 individual slices on the Y axis. I am adding a MiDaS depth node to my custom nodes, but can't get a workable image for other nodes. I can save the image out just fine from PIL (below). Example:

```python
import numpy as np
import torch
from PIL import Image

depth = prediction.cpu().numpy()
depth = (depth * 255 / (np.max(depth) + 1)).astype('uint8')
depth_image = Image.fromarray(depth)
depth_image.save('TEST_DEPTH.png')
tensor = torch.from_numpy(depth)
tensors = (tensor,)
```
Answered by comfyanonymous, Mar 17, 2023
-
See the LoadImage node: https://github.com/comfyanonymous/ComfyUI/blob/master/nodes.py#L863
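For context, LoadImage converts a PIL image to a float32 tensor in the 0-1 range with an explicit batch dimension, giving shape `[batch, height, width, channels]`. The "768 slices" symptom comes from passing a 2-D `[H, W]` uint8 tensor, whose first axis is then treated as the batch. A minimal sketch applying that pattern to the depth output above (the `depth_to_tensor` helper name is hypothetical, not part of ComfyUI):

```python
import numpy as np
import torch
from PIL import Image

def depth_to_tensor(prediction: torch.Tensor) -> torch.Tensor:
    # Normalize the depth map to uint8, as in the question's snippet.
    depth = prediction.cpu().numpy()
    depth = (depth * 255 / (np.max(depth) + 1)).astype('uint8')
    # Expand single-channel depth to RGB so downstream image nodes get 3 channels.
    img = Image.fromarray(depth).convert("RGB")
    # Float32 in 0-1, shape [H, W, 3].
    arr = np.array(img).astype(np.float32) / 255.0
    # Add the batch dimension -> [1, H, W, 3], the shape LoadImage produces.
    return torch.from_numpy(arr)[None,]
```

The key differences from the original snippet are the float32 0-1 normalization and the leading batch dimension; without it, each row of the image is interpreted as a separate item.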
Answer selected by WASasquatch