Sakura’s Participation in WAT 2021: Effectiveness of Pretrained Models for Multilingual and Multimodal Machine Translation
August 2, 2021 — by ritwp-kotresha-kb

Categories: Language Program, Computer Vision, Language Pretrained Model, Natural Language Processing