Replies: 2 comments
-
Sadly, no answers on this. I was also looking into it in order to do very fast training on very small datasets; I'd like to know how that could be done.
-
I played around with the LATENT_SIZE parameter when training with the Wasserstein encoder, because the default value of 128 resulted in models I couldn't put to use: nn~ objects with the corresponding number of inlets/outlets crashed Pd as soon as DSP processing started. The new default of 16 worked just fine. At some point I'd need to understand, or check in detail, how many latent dimensions make sense in which situation (e.g. depending on the size and nature of the training data, "simple sounds" vs. full-length tracks). If anyone has recommendations or feedback on this, it would be appreciated. One more question: when training on full-length tracks, would it be advisable to set a larger N_SIGNAL value, so that a representative stretch of the piece's development is processed in each training example?
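For what it's worth, here is a minimal sketch of how one might check the latent dimensionality of an exported model before pointing an nn~ object at it. It assumes (not stated in this thread) that the model has been exported to a TorchScript file and exposes an `encode` method taking a (batch, channels, samples) audio tensor and returning (batch, latents, frames); the file name is hypothetical, so adjust to your own export.

```python
# Hedged sketch: inspect how many latent dimensions an exported RAVE model
# exposes before creating an nn~ object for it in Pd.
# Assumption: the export is TorchScript with an `encode` method that takes
# (batch, channels, samples) audio and returns (batch, latents, frames).

import torch

model = torch.jit.load("exported_model.ts").eval()  # hypothetical file name

with torch.no_grad():
    dummy_audio = torch.randn(1, 1, 2 ** 16)   # ~1.5 s of noise at 44.1 kHz
    latent = model.encode(dummy_audio)          # expected shape: (1, latents, frames)

print("latent dimensions:", latent.shape[1])    # e.g. 16 with LATENT_SIZE=16
```

On the N_SIGNAL question: if N_SIGNAL is the per-example window length in samples, its duration is simply N_SIGNAL / sampling_rate (e.g. 65536 samples is roughly 1.5 s at 44.1 kHz), so a larger value does give the model a longer excerpt of the piece per training step, at the cost of memory.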
-
Hey all, I was thinking this would be a useful documentation post for RAVE users out there.
Can you provide some intuition on when or why it is advisable to modify some of the more interesting training parameters? Does anyone have any experience modifying the following parameters outside of their defaults? Could you share why you changed these values and what the intended effects were?
For example: