# Bert-VITS2-Faster

  1. Original algorithm repository: Bert-VITS2 (v2.0.2).
  2. All submodels are converted to TensorRT FP16 engines, yielding up to 9x faster inference.
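Much of the FP16 speedup comes from halved memory traffic (plus Tensor Core throughput on supported GPUs). A minimal NumPy illustration of the storage saving, with illustrative shapes that are not the model's real dimensions:

```python
import numpy as np

# Dummy activation tensor; shape is illustrative, not a real VITS2 size.
x_fp32 = np.random.randn(1, 192, 512).astype(np.float32)
x_fp16 = x_fp32.astype(np.float16)

# FP16 stores each value in 2 bytes instead of 4, halving memory use.
print(x_fp32.nbytes)  # 393216
print(x_fp16.nbytes)  # 196608
```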

## Usage

  1. Train your own model with `train_ms.py`.
     - Prepare your own `config.yml` following `default_config.yml`.
  2. Run `sh bert_to_onnx.sh` to export the BERT models to ONNX.
  3. Run `python infer_torch_export_onnx.py` (first, comment out the line `g_model_name = None`).
  4. See `inputs.py` to build the TensorRT engine (`trt.engine`).
  5. Run `python infer_backend_mix.py` to test TensorRT inference.

## What you can do

  1. Change the training forward pass to merge all 6 submodels into one model, producing a single `VITS2.engine`.
  2. Improve the performance of VITS2 on emotion.
  3. When a submodel suffers severe quality loss under TensorRT, replace its TRT engine with the ONNX version.
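Item 1 above amounts to wrapping the submodels in a single `nn.Module` so one forward pass yields one exportable graph (and hence one TensorRT engine). A hypothetical sketch with placeholder submodels, not the repo's real classes:

```python
import torch
import torch.nn as nn

class MergedVITS2(nn.Module):
    """Hypothetical wrapper: chains submodels so a single forward pass
    covers the whole pipeline and can be exported as one graph."""
    def __init__(self, encoder, flow, decoder):
        super().__init__()
        self.encoder = encoder  # placeholder submodels for illustration
        self.flow = flow
        self.decoder = decoder

    def forward(self, x):
        h = self.encoder(x)
        z = self.flow(h)
        return self.decoder(z)

# Toy submodels standing in for the 6 real ones.
merged = MergedVITS2(nn.Linear(16, 32), nn.Tanh(), nn.Linear(32, 8)).eval()
out = merged(torch.randn(1, 16))  # one forward -> one exportable graph
```

Exporting `merged` with `torch.onnx.export` then gives a single ONNX graph to build into one engine, avoiding per-submodel launch and copy overhead.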