Sep, 2018
A Re-ranker Scheme for Integrating Large Scale NLU models
Chengwei Su, Rahul Gupta, Shankar Ananthakrishnan, Spyros Matsoukas
TL;DR
This work proposes a novel re-ranking strategy that improves the performance of natural language understanding (NLU) systems while preserving a decoupled, domain-specific design, and improves model calibration and cross-domain performance.
Abstract
Large scale natural language understanding (NLU) systems are typically trained on large quantities of data, requiring a fast and scalable training strategy. A typical design for NLU systems consists of domain-lev…
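To make the TL;DR concrete, here is a minimal sketch of the general idea of re-ranking hypotheses produced by independent domain-specific NLU models with a shared scoring model. This is not the authors' implementation: the class names, feature set, and weights below are illustrative assumptions only.

```python
# Hypothetical sketch of cross-domain re-ranking (not the paper's actual
# system): each domain model emits scored hypotheses independently, and a
# shared linear re-ranker merges and re-orders them. The per-domain bias
# feature lets the re-ranker calibrate domains against one another without
# retraining any domain model, preserving the decoupled design.
from dataclasses import dataclass

@dataclass
class Hypothesis:
    domain: str          # e.g. "Music", "Weather" (illustrative names)
    intent: str
    domain_score: float  # confidence from the domain-level model

def rerank(hypotheses, weights):
    """Re-score each hypothesis with a weighted feature sum and sort.

    `weights` maps feature names to learned weights; the features here
    are assumptions, not the paper's actual feature set.
    """
    def score(h):
        feats = {
            "domain_score": h.domain_score,
            # Indicator feature: a learned per-domain bias.
            f"bias:{h.domain}": 1.0,
        }
        return sum(weights.get(k, 0.0) * v for k, v in feats.items())

    return sorted(hypotheses, key=score, reverse=True)

# Usage: merge hypotheses from two domain models and pick the best.
hyps = [
    Hypothesis("Music", "PlaySong", 0.62),
    Hypothesis("Weather", "GetForecast", 0.58),
]
weights = {"domain_score": 1.0, "bias:Weather": 0.1}
best = rerank(hyps, weights)[0]
```

With the bias weight above, the Weather hypothesis (0.58 + 0.1) outranks the Music one (0.62), illustrating how a small learned correction can re-calibrate otherwise independent domain scores.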