June 2023
Transfer Learning from Pre-trained Language Models Improves End-to-End Speech Summarization
Kohei Matsuura, Takanori Ashihara, Takafumi Moriya, Tomohiro Tanaka, Takatomo Kano, et al.
TL;DR: This paper proposes integrating a pre-trained language model into an end-to-end speech summarization (E2E SSum) model to address data scarcity, and uses transfer learning to narrow the gap between the speech encoder and the text decoder. Experiments show that the proposed model outperforms both the baseline and a data-augmentation model.
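The core idea, initializing the summarization decoder from a pre-trained language model rather than from scratch, can be illustrated with a minimal sketch. This is not the paper's actual code: the parameter names, shapes, and the use of NumPy dictionaries as stand-ins for model modules are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pre-trained LM parameters (e.g. token embeddings and
# an output projection over a toy 100-token vocabulary).
pretrained_lm = {
    "embed": rng.normal(size=(100, 16)),
    "out_proj": rng.normal(size=(16, 100)),
}

# Randomly initialized E2E SSum model; its decoder is assumed to share
# the pre-trained LM's architecture, so the shapes match.
ssum_model = {
    "speech_encoder": rng.normal(size=(32, 16)),   # trained from scratch
    "decoder_embed": rng.normal(size=(100, 16)),
    "decoder_out": rng.normal(size=(16, 100)),
}

# Transfer learning step: overwrite the decoder's random initialization
# with the pre-trained LM's weights before fine-tuning end-to-end.
ssum_model["decoder_embed"] = pretrained_lm["embed"].copy()
ssum_model["decoder_out"] = pretrained_lm["out_proj"].copy()
```

After this initialization, the whole model would be fine-tuned on paired speech-summary data, letting the speech encoder adapt to the LM-initialized decoder.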