transformer paper collection download
Time: 2025-05-26 09:57
Source: http://www.java1234.com
Author: reprint
Main content:
1 Introduction
Transformer has been the most widely used architecture for machine translation (Vaswani et al., 2017). Despite its strong performance, the decoding of Transformer is inefficient because it adopts a sequential auto-regressive factorization for its probability model (Figure 1a). Recent work, such as the non-autoregressive transformer (NAT), aims to decode target tokens in parallel to speed up generation (Gu et al., 2018). However, the vanilla NAT still lags behind Transformer in translation quality, with a gap of about 7.0 BLEU points. NAT assumes conditional independence of the target tokens given the source sentence. We suspect that NAT's conditional independence assumption prevents it from learning word interdependency in the target sentence. Notice that such word interdependency is crucial, as the Transformer explicitly captures it by decoding from left to right (Figure 1a).
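The contrast between the two factorizations can be sketched in Python. This is a toy illustration with stand-in "model" functions (not the paper's implementation): the point is only that autoregressive decoding runs L dependent steps, while NAT predicts every position independently of the others.

```python
# Toy contrast between autoregressive and non-autoregressive decoding.
# The toy_* functions are stand-ins for a trained model, chosen only to
# make the data flow visible; they are not the paper's code.

def toy_next_token(source, prefix):
    """Stand-in for one autoregressive decoder step: the next target
    token may depend on the source AND every previously emitted token."""
    return f"t{len(prefix)}"

def autoregressive_decode(source, length):
    """Transformer-style decoding: `length` sequential steps, each one
    conditioned on the tokens produced so far (left-to-right)."""
    prefix = []
    for _ in range(length):
        prefix.append(toy_next_token(source, prefix))
    return prefix

def toy_token_at(source, position):
    """Stand-in for a NAT prediction: each target position depends only
    on the source sentence (the conditional independence assumption)."""
    return f"t{position}"

def nat_decode(source, length):
    """NAT-style decoding: all positions predicted in one parallel step;
    no token can condition on its neighbours, which is exactly why word
    interdependency in the target is hard for NAT to learn."""
    return [toy_token_at(source, i) for i in range(length)]

print(autoregressive_decode("src", 4))  # four dependent steps
print(nat_decode("src", 4))             # one parallel step
```

In the autoregressive loop each call sees the growing prefix, so the decoder can model target-side dependencies; in the NAT list comprehension every position is computed from the source alone, which buys parallelism at the cost of those dependencies.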