I was carrying something when I received a Slack notification from my boss. I tried to reply while walking, but the message ...
Abstract: The Mixture of Experts (MoE) model is a promising approach for handling code-switching speech recognition (CS-ASR) tasks. However, the existing CS-ASR work on MoE has yet to leverage the ...
Abstract: LLMs (Large-scale Language Models), such as ChatGPT, have been widely used for a variety of purposes. The current sophisticated LLMs have a voice mode, which allows a user to input and ...