fix the CTC zipformer2 training #1713
base: master
Conversation
- too many supervision tokens
- change the filtering rule to `if (T - 2) < len(tokens): return False` (a sketch of the updated predicate is shown below)
- this prevents `inf` from appearing in the CTC loss value
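For context, here is a minimal sketch of where such a rule typically lives in the zipformer2 training script: the cut-filtering predicate applied before training. Only the changed condition is taken from this PR; the function name `remove_short_and_long_utt`, the duration limits, the BPE model path, and the subsampling formula are assumptions used for illustration.

```python
import sentencepiece as spm
from lhotse import Cut

# Hypothetical BPE model path; in the recipes this comes from the lang dir.
sp = spm.SentencePieceProcessor()
sp.load("data/lang_bpe_500/bpe.model")


def remove_short_and_long_utt(c: Cut) -> bool:
    # Drop cuts that are unreasonably short or long (assumed limits).
    if c.duration < 1.0 or c.duration > 20.0:
        return False

    # Number of encoder frames after the convolutional subsampling
    # (formula assumed from the zipformer recipes).
    T = ((c.num_frames - 7) // 2 + 1) // 2

    tokens = sp.encode(c.supervisions[0].text, out_type=str)

    # Proposed change: keep a margin of 2 frames so that cuts whose token
    # sequence (almost) exceeds the number of frames are filtered out and
    # the CTC loss cannot become inf.
    if (T - 2) < len(tokens):
        return False

    return True
```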
workflow with error: https://github.com/k2-fsa/icefall/actions/runs/10348851808/job/28642009312?pr=1713
but the error seems related to the file location, or maybe too many tests running at the same time? (an overloaded HuggingFace?)
Hi @csukuangfj, it solves issue #1352. My theory is that CTC uses 2 extra symbols at the beginning/end of the label sequence. Best regards
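To make the failure mode concrete, here is a minimal PyTorch snippet (not from this PR) showing that when the number of encoder frames `T` is smaller than the number of target tokens, no valid CTC alignment exists and the loss evaluates to `inf`:

```python
import torch
import torch.nn.functional as F

T, C = 5, 10  # 5 encoder frames, 10 output classes (0 = blank)
log_probs = torch.randn(T, 1, C).log_softmax(-1)  # shape (T, N, C), N = 1

# 6 target tokens but only 5 frames: there is no valid CTC alignment.
targets = torch.tensor([[1, 2, 3, 4, 5, 6]])
loss = F.ctc_loss(
    log_probs,
    targets,
    input_lengths=torch.tensor([T]),
    target_lengths=torch.tensor([6]),
    blank=0,
)
print(loss)  # tensor(inf)
```

Note that with repeated adjacent tokens CTC additionally needs blank frames between them, so the true requirement can be stricter than `T >= len(tokens)`.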
Sorry for the late reply. Could you analyze the wave that causes the inf loss? Does it contain only a single word or does it contain nothing at all?
Hi, text:
It seems like a better set of BPEs could reduce the number of supervision tokens. I believe the two extra tokens for the CTC loss are the
Best regards
Thanks for sharing! Could you also post the duration of the corresponding wave file?
This is the corresponding cut:
It is a 1.44 sec long cut inside a very long recording (2.42 hrs). Definitely a data issue. K.
Yes, I think it would be good to filter out such data.
Hello, is there anything needed from my side for this to be merged?
The root cause is bad data. Would it be more appropriate to fix it when preparing the data? The `-2` term is not a constraint for computing the CTC or the transducer loss.
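If the filtering were done at data-preparation time instead, one possible approach (a sketch only; the lhotse calls are standard, but the file names and the subsampling formula are placeholders/assumptions) would be:

```python
import sentencepiece as spm
from lhotse import CutSet

sp = spm.SentencePieceProcessor()
sp.load("data/lang_bpe_500/bpe.model")  # placeholder path


def has_enough_frames(c) -> bool:
    # Assumed subsampling formula; mirror of the training-time check.
    T = ((c.num_frames - 7) // 2 + 1) // 2
    tokens = sp.encode(c.supervisions[0].text, out_type=str)
    return (T - 2) >= len(tokens)


cuts = CutSet.from_file("data/fbank/train_cuts.jsonl.gz")  # placeholder
cuts.filter(has_enough_frames).to_file(
    "data/fbank/train_cuts_filtered.jsonl.gz"  # placeholder
)
```

This keeps the training script unchanged and removes the problematic cuts from the manifests once.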
Well, without that, I also did not find any trace of the extra CTC symbols or similar in the scripts. Could you try to reproduce the issue by adding a training example with a very lengthy transcript? Best regards,
```python
if (T - 2) < len(tokens):
    return False
```