Compatibility for TF>=2.11 #35
Comments
Better to use WSL at the moment to save your time.
I checked with the author and he said a CPU version of TF would be fine in google-deepmind/tapnet#120 (comment), though he hasn't replied to my follow-ups yet. Ideally I would prefer to just install one of your wheels and avoid WSL2, since I try to introduce as few components as possible for my application. Do you think it's worthwhile to give it a try if a GPU version of TF is not required? Or do you still think WSL2 is better? I appreciate any comments and suggestions!
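If the CPU-only TF route is attempted, a quick sanity check along the lines below can confirm that jax still sees the GPU while TensorFlow stays on the CPU. This is a minimal sketch of my own, assuming a CUDA build of `jax`/`jaxlib` together with the `tensorflow-cpu` package in the same environment; it is not taken from the thread.

```python
# Minimal sanity check: GPU-enabled jax alongside a CPU-only TensorFlow.
# Assumes a CUDA build of jaxlib and the tensorflow-cpu package are installed.
import jax
import jax.numpy as jnp
import tensorflow as tf

# jax should report the GPU if the CUDA wheel is working.
print("jax backend:", jax.default_backend())   # expected: "gpu"
print("jax devices:", jax.devices())

# A CPU-only TensorFlow build lists no GPU devices, which is fine if TF is
# only needed for pre/post-processing.
print("tf GPU devices:", tf.config.list_physical_devices("GPU"))  # expected: []

# Tiny computation to confirm the jax GPU path actually executes.
x = jnp.ones((1000, 1000))
print(float((x @ x).sum()))
```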
Hi @cloudhan, I have installed Visual Studio 2022 as required in https://docs.nvidia.com/deeplearning/cudnn/latest/reference/support-matrix.html# and have done the following:

And I got the messages below for the two attempts to run:

Below are my
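As a general aid for this kind of report, a short environment dump can narrow things down. The sketch below is my own addition rather than part of the original comment, and it only touches the packages already under discussion (`jax`, `jaxlib`, `tensorflow`):

```python
# Diagnostic sketch: collect version and build information that helps when
# debugging CUDA/cuDNN setups on Windows. Adjust as needed.
import platform
import sys

import jax
import jaxlib
import tensorflow as tf

print("python    :", sys.version)
print("platform  :", platform.platform())
print("jax       :", jax.__version__)
print("jaxlib    :", jaxlib.__version__)
print("tensorflow:", tf.__version__)

# For GPU builds this reports the CUDA/cuDNN versions TF was built against;
# CPU-only builds simply omit these keys.
build = tf.sysconfig.get_build_info()
print("tf CUDA   :", build.get("cuda_version"))
print("tf cuDNN  :", build.get("cudnn_version"))
```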
Also tried the following and fixed all dependency issues:

But still got the following:
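One thing worth double-checking after fixing dependencies on Windows is that the fixes landed in the interpreter that actually runs the script, since it is easy to end up with several Python environments. The check below is only a suggestion of mine, not something from the thread:

```python
# Confirm which interpreter and site-packages directory are actually in use,
# to rule out "fixed the dependencies in a different environment" mistakes.
import site
import sys

print("interpreter  :", sys.executable)
print("prefix       :", sys.prefix)
print("site-packages:", site.getsitepackages())

# Where the key packages resolve from in this environment.
import jax
import tensorflow

print("jax from        :", jax.__file__)
print("tensorflow from :", tensorflow.__file__)
```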
Hi @cloudhan, it seems that others encountered a similar issue in #33. I am currently suspecting 3 things:

Regarding 2, I did the following but got the same error:

My next step is to install Visual Studio 2019 to check on 3 and see how it goes. In the meantime, I would appreciate it if you could offer any suggestions! Below is the
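Whichever of the three suspects it turns out to be, it may also help to reproduce the failure with `jax` alone, outside `live_demo.py`, so the error can be pinned on the jaxlib/CUDA setup rather than on the other dependencies. A minimal sketch of such a check (my own suggestion, not taken from the thread):

```python
# Smallest possible jax GPU exercise: if this fails, the problem is in the
# jaxlib/CUDA/cuDNN installation itself rather than in live_demo.py's
# other dependencies.
import jax
import jax.numpy as jnp

print("backend:", jax.default_backend())
print("devices:", jax.devices())

@jax.jit
def f(x):
    return (x * 2.0 + 1.0).sum()

x = jnp.arange(16.0)
print("result :", f(x))  # forces compilation and execution on the default device
```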
Hi, I am a rookie in this field and I want to run `live_demo.py` from https://github.com/google-deepmind/tapnet on a GPU in Windows. The requirements file involves `jax` and `tensorflow`, but `live_demo.py` only explicitly imports `jax`. I have two questions below: Is `tensorflow` implied in order to run `live_demo.py` or `jax` on a GPU?