Changing CTC Rules to Reduce Memory Consumption in Training and Decoding

Aleksandr Laptev | NVIDIA Technical Blog | September 12, 2022

Loss functions for training automatic speech recognition (ASR) models are not set in stone, and the older rules of loss functions are not necessarily optimal. Consider connectionist temporal classification (CTC) and see how changing some of its rules enables you to reduce the GPU memory required for training and inference of CTC-based models, and more. For more information about the…
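
To make the memory cost concrete, here is a minimal sketch of the standard CTC forward (alpha) recursion in pure Python. This is not the post's modified rule set, just the textbook baseline it starts from: the label sequence is extended with interleaved blanks, and an alpha table of shape T × (2L + 1) is filled frame by frame. Only two rows are kept here because we compute the loss alone; training frameworks retain all T rows for backpropagation, which is the memory that rule changes aim to shrink.

```python
import math


def logsumexp(*xs):
    """Numerically stable log(sum(exp(x))) over the given terms."""
    xs = [x for x in xs if x != float("-inf")]
    if not xs:
        return float("-inf")
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))


def ctc_loss(log_probs, labels, blank=0):
    """Negative log-likelihood of `labels` under standard CTC rules.

    log_probs: list of T rows, each a list of per-symbol log-probabilities.
    labels:    target symbol ids (without blanks).

    Memory note: a full alpha table is T x S with S = 2*len(labels) + 1;
    here only two rows are held at a time since no gradient is needed.
    """
    ext = [blank]
    for l in labels:                  # interleave blanks: b, l1, b, l2, ..., b
        ext += [l, blank]
    S, T, NEG = len(ext), len(log_probs), float("-inf")

    prev = [NEG] * S                  # alpha row for frame 0
    prev[0] = log_probs[0][blank]
    if S > 1:
        prev[1] = log_probs[0][ext[1]]

    for t in range(1, T):
        cur = [NEG] * S
        for s in range(S):
            terms = [prev[s]]         # stay on the same extended symbol
            if s > 0:
                terms.append(prev[s - 1])           # advance by one
            if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                terms.append(prev[s - 2])           # skip a blank
            cur[s] = logsumexp(*terms) + log_probs[t][ext[s]]
        prev = cur

    # A valid path ends on the last label or the final blank.
    return -logsumexp(prev[S - 1], prev[S - 2] if S > 1 else NEG)
```

For a hypothetical 3-frame utterance with a 3-symbol vocabulary, `ctc_loss(lp, [1])` sums the probability of every frame-level path that collapses (merge repeats, drop blanks) to `[1]`.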

Source
