how to train a small LM? #2933
VoiceInteli started this conversation in General
Replies: 3 comments
- `scripts/asr_language_modeling/ngram_lm/train_kenlm.py` from https://github.com/NVIDIA/NeMo might help you.
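To make the suggestion above concrete: a script like `train_kenlm.py` builds a count-based n-gram LM over a text corpus. As a minimal sketch of the same idea (not the NeMo script itself; the toy corpus and add-one smoothing are illustrative assumptions), a tiny bigram model can be trained in a few lines:

```python
# Toy bigram LM trained by counting, illustrating what an n-gram
# toolkit such as KenLM builds at much larger scale.
# Corpus and smoothing choice here are illustrative assumptions.
from collections import Counter


def train_bigram_lm(sentences):
    """Count bigrams over a toy corpus; return a probability function."""
    unigrams = Counter()
    bigrams = Counter()
    vocab = set()
    for sent in sentences:
        tokens = ["<s>"] + sent.split() + ["</s>"]
        vocab.update(tokens)
        unigrams.update(tokens[:-1])          # contexts (everything but </s>)
        bigrams.update(zip(tokens, tokens[1:]))
    v = len(vocab)

    def prob(prev, word):
        # Add-one (Laplace) smoothing keeps unseen bigrams nonzero.
        return (bigrams[(prev, word)] + 1) / (unigrams[prev] + v)

    return prob


corpus = ["the cat sat", "the dog sat", "a cat ran"]
prob = train_bigram_lm(corpus)
print(prob("the", "cat") > prob("the", "ran"))  # seen bigram outranks unseen
```

The model size here is just the two count tables, which is why shrinking the n-gram order or the corpus directly shrinks the resulting LM.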
- Hi, under the recipes/LibriSpeech folder you will find "LM". There you can just modify the yaml to create a smaller one :)
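For the yaml route above, the usual lever is the capacity hyperparameters. The excerpt below is a hypothetical sketch (the field names are illustrative assumptions, not the recipe's actual keys):

```yaml
# Hypothetical LM-yaml excerpt: reduce these values for a smaller model.
emb_dim: 128       # embedding size, e.g. down from 1024
rnn_layers: 1      # number of recurrent layers
rnn_neurons: 256   # hidden size per layer
dropout: 0.2
```

Halving the embedding and hidden sizes roughly quarters the parameter count of the corresponding layers, so even modest reductions shrink the model substantially.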
- Hi Gurus,
I am trying to train an ASR model with LibriSpeech. The LMs on Hugging Face used by the recognizers are too big. Could you tell me how I can train a smaller LM? Where is the script I can use and modify, and which corpora?
Thanks,
Willy