Abstract:
With the development of information technology, machine translation plays a crucial role in cross-language communication. However, machine translation suffers from information loss. To address this common problem, this paper proposes three Transformer-Information Combination (Transformer-IC) models based on an information combination method. The models build on the Transformer and select different middle-layer information to compensate the output, using an arithmetic mean combination method, a linear transformation method, and a multi-layer information combination method, respectively. Experimental results on the Linguistic Data Consortium (LDC) Chinese-to-English corpus and the International Workshop on Spoken Language Translation (IWSLT) English-to-German corpus show that the BLEU scores of all Transformer-IC models are higher than that of the reference model; in particular, the arithmetic mean combination method improves the BLEU score by 1.9. Although the BERT model performs well, the comparison shows that the Transformer-IC models outperform it. Transformer-IC models make full use of middle-layer information and effectively avoid the problem of information loss. © 2021 IEEE
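The abstract does not give implementation details, but the combination strategies it names (arithmetic mean and linear transformation over selected middle-layer outputs) can be illustrated with a minimal PyTorch sketch. All module and parameter names below are hypothetical and for illustration only; they are not taken from the paper.

```python
import torch
import torch.nn as nn


class MiddleLayerCombiner(nn.Module):
    """Illustrative sketch: combine selected intermediate Transformer layer
    outputs to compensate the final output, per the abstract's description.
    Names and structure are assumptions, not the paper's implementation."""

    def __init__(self, d_model: int, num_layers: int, mode: str = "mean"):
        super().__init__()
        self.mode = mode
        if mode == "linear":
            # Learned linear transformation over the concatenated layer outputs.
            self.proj = nn.Linear(d_model * num_layers, d_model)

    def forward(self, layer_outputs):
        # layer_outputs: list of (batch, seq_len, d_model) tensors,
        # one per selected layer (final layer included).
        if self.mode == "mean":
            # Arithmetic mean combination of the selected layers.
            return torch.stack(layer_outputs, dim=0).mean(dim=0)
        # Linear transformation combination: concatenate along the feature
        # dimension, then project back to d_model.
        return self.proj(torch.cat(layer_outputs, dim=-1))


# Usage example: combine the outputs of three selected layers of an encoder.
combiner = MiddleLayerCombiner(d_model=512, num_layers=3, mode="mean")
states = [torch.randn(2, 10, 512) for _ in range(3)]
combined = combiner(states)  # shape: (2, 10, 512)
```

The multi-layer information combination method mentioned in the abstract would presumably extend this idea to several layer groups at once, but the abstract does not specify how, so it is not sketched here.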
Year: 2021
Page: 97-101
Language: English