Optimization of English translation model combining deep learning and attention mechanism and its application in cross-cultural communication
Abstract
In today’s globalized context, cross-cultural communication is increasingly frequent, and English translation has become correspondingly important. Although the traditional Transformer architecture captures global dependencies through self-attention, its O(n²) computational complexity leads to significant semantic gaps in long-text translation (for example, the error rate on complex sentence structures reaches 37%), and its translation accuracy for culture-loaded words is insufficient (78.4%). This study proposes DAT-NMT, a model that integrates a dual attention mechanism with adversarial training. Through dynamic adjustment of word-sentence hierarchical attention (BLEU reaches 45.2 at τ = 0.5) and explicit encoding of cultural features, the model achieves a dual breakthrough in semantic preservation and cultural adaptation. Experiments show that the model improves on the baseline by 23.8% on the WMT2020 dataset and reaches 92.3% accuracy on culturally sensitive words, providing a new paradigm for cross-cultural translation that combines efficiency and accuracy. In cross-cultural communication scenarios, DAT-NMT also performs well: it better handles texts rich in cultural connotation, reduces translation errors caused by cultural differences, offers more reliable translation support for cross-cultural communication, and opens a new technical direction for the field of English translation.
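The word-sentence hierarchical attention with a blending weight τ might be sketched as below. This is only an illustrative guess at the mechanism: the abstract reports just the setting τ = 0.5, so the gating form, the function names, and the idea of mixing two softmax-normalized score vectors are all assumptions, not the paper's actual implementation.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of raw attention scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dual_attention(word_scores, sent_scores, tau=0.5):
    """Blend word-level and sentence-level attention distributions.

    tau = 1.0 attends purely at the word level, tau = 0.0 purely at the
    sentence level; tau = 0.5 (the setting reported in the abstract)
    weights both levels equally. The linear gate here is a hypothetical
    simplification of the paper's dynamic adjustment.
    """
    w = softmax(word_scores)
    s = softmax(sent_scores)
    return [tau * wi + (1.0 - tau) * si for wi, si in zip(w, s)]
```

Because both inputs are normalized before blending, the result remains a valid probability distribution over source positions for any τ in [0, 1].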
Topics & Keywords
Author (1)
Shengqin Bi
Quick Access
- Publication Year
- 2025
- Source Database
- DOAJ
- DOI
- 10.1007/s44163-025-00511-6
- Access
- Open Access ✓