Multilingual DistilWhisper: Efficient Distillation of Multi-task Speech Models via Language-Specific Experts
ICASSP, 2024
TL;DR: Proposes a lightweight adaptation method that bridges the performance gap between small and large multi-task speech models (Whisper) on under-represented languages by adding language-specific expert modules and distilling knowledge from the larger model, outperforming standard fine-tuning and LoRA with minimal parameter overhead.
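
Below is a minimal, hedged PyTorch sketch of the general idea rather than the authors' implementation: a per-language feed-forward expert with a learned gate sits alongside a frozen layer of the small model, and the student is trained with a cross-entropy term plus a temperature-scaled KL distillation term against a frozen teacher. The class and function names (`LanguageSpecificExpert`, `distillation_loss`) are illustrative, and the paper's exact gating and distillation objective may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LanguageSpecificExpert(nn.Module):
    """Parallel feed-forward expert with a per-token gate, added next to a frozen FFN."""

    def __init__(self, d_model: int, d_ff: int, languages: list[str]):
        super().__init__()
        # One lightweight expert (and gate) per supported language.
        self.experts = nn.ModuleDict({
            lang: nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for lang in languages
        })
        self.gates = nn.ModuleDict({lang: nn.Linear(d_model, 1) for lang in languages})

    def forward(self, hidden: torch.Tensor, frozen_ffn_out: torch.Tensor, lang: str) -> torch.Tensor:
        # Gate in [0, 1] decides, per token, how much of the expert output to mix in.
        g = torch.sigmoid(self.gates[lang](hidden))       # (batch, seq, 1)
        expert_out = self.experts[lang](hidden)           # (batch, seq, d_model)
        return (1 - g) * frozen_ffn_out + g * expert_out


def distillation_loss(student_logits, teacher_logits, labels, alpha=0.5, temperature=2.0):
    """CE on ground-truth labels plus temperature-scaled KL to the teacher (illustrative objective)."""
    ce = F.cross_entropy(student_logits.transpose(1, 2), labels, ignore_index=-100)
    kl = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return (1 - alpha) * ce + alpha * kl


if __name__ == "__main__":
    d_model, d_ff, vocab = 256, 1024, 1000
    expert = LanguageSpecificExpert(d_model, d_ff, languages=["ca", "uk"])
    hidden = torch.randn(2, 10, d_model)
    frozen_out = torch.randn(2, 10, d_model)   # stands in for the frozen small model's FFN output
    mixed = expert(hidden, frozen_out, lang="ca")
    print(mixed.shape)                          # torch.Size([2, 10, 256])

    student = torch.randn(2, 10, vocab, requires_grad=True)
    teacher = torch.randn(2, 10, vocab)         # stands in for frozen teacher logits
    labels = torch.randint(0, vocab, (2, 10))
    print(distillation_loss(student, teacher, labels).item())
```

Only the expert and gate parameters would be trained in this setup, which is what keeps the overhead small: the frozen backbone is shared across languages and each language adds just its own expert module.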