Replies: 1 comment
-
That would be cool. I gave it a quick try and ran into issues almost immediately. The first problem was the preinstalled version of numpy, so I tried pinning numpy to 1.21.6, which only led to more errors. Coqui-TTS itself installs and runs fine, so the conflicts seem to come from epub2tts's other dependencies. It would be great if anyone familiar with building Colab notebooks wanted to take a shot at it, or could offer guidance. On the other hand, are notebooks reliable enough for long-running tasks like this? When I've used them for other projects, anything that took more than a few minutes seemed doomed to fail. Maybe I just had bad luck though :)
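For anyone who wants to pick this up, here is a rough sketch of what the setup cell might look like. To be clear, this is untested: the numpy pin is just the version I tried, the git install path is an assumption based on the repo location, and the whole thing may still hit the same dependency conflicts.

```shell
# Sketch of a Colab setup cell for epub2tts -- untested, pins are guesses.

# Pin numpy first; newer preinstalled versions were the first thing to break.
pip install "numpy==1.21.6"

# Coqui-TTS installs and runs fine on its own.
pip install TTS

# Optional: deepspeed, for faster XTTS inference on the free T4 GPU.
pip install deepspeed

# epub2tts itself, installed straight from the repo (assumed install path).
pip install git+https://github.com/aedocw/epub2tts
```

If the conflicts persist, `pip install --dry-run` or `pip check` after each step might help narrow down which dependency is pulling in the incompatible version.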
-
The tool you've created here is really awesome!
One thing that would be great to have is the ability to run this in Google Colab!
The XTTSv2 model sounds really impressive, but its performance requirements nearly rule it out for local use (e.g., it runs at roughly 1x real-time inference speed on Apple chips 🤕).
As far as I understand, using the free T4 GPU plus deepspeed on Colab can give a big speedup (possibly 10x).
Unfortunately, not having much experience with those tools, I couldn't resolve the dependency conflicts to make it work.
If someone knows how to set up a Colab environment that meets the epub2tts requirements, it would be great if you'd share that notebook ⭐