Jan, 2016
Investigating gated recurrent neural networks for speech synthesis
Zhizheng Wu, Simon King
TL;DR
This study aims to answer two questions: (a) why the long short-term memory (LSTM) network, as a sequence model, performs well in SPSS; and (b) which of its components (e.g., the input gate, output gate, or forget gate) is most important. Through a series of experiments and visual analyses, we propose a simplified architecture with fewer parameters than the LSTM, which significantly reduces the complexity of generation without degrading quality.
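The gates the TL;DR refers to can be seen in a single step of a standard LSTM cell. The sketch below is a generic textbook formulation in NumPy (the parameter names, stacking order, and shapes are illustrative assumptions), not the paper's implementation or its proposed simplified variant:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One step of a standard LSTM cell (generic sketch, not the paper's model).

    W (4n x d), U (4n x n), and b (4n,) hold the stacked parameters for the
    input gate (i), forget gate (f), output gate (o), and candidate cell
    update (g), in that order.
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b          # stacked pre-activations, shape (4n,)
    i = sigmoid(z[0 * n:1 * n])         # input gate
    f = sigmoid(z[1 * n:2 * n])        # forget gate
    o = sigmoid(z[2 * n:3 * n])        # output gate
    g = np.tanh(z[3 * n:4 * n])        # candidate cell state
    c = f * c_prev + i * g              # new cell state
    h = o * np.tanh(c)                  # new hidden state
    return h, c
```

Ablating one gate at a time in a cell like this (e.g., fixing the input or output gate to 1) is one way to probe which gate matters most, which is the kind of question the paper investigates.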
Abstract
Recently, recurrent neural networks (RNNs) as powerful sequence models have re-emerged as a potential acoustic model for statistical parametric speech synthesis (SPSS). The long short-term memory (LSTM) architecture […]