mBART, a new seq2seq multilingual pretraining system for machine translation across 25 languages

Facebook releases multilingual BART (mBART)
"New research! We’re releasing mBART, a new seq2seq multilingual pretraining system for machine translation across 25 languages. It gives significant improvements for document-level translation and low-resource languages. Read our paper to learn more: https://arxiv.org/pdf/2001.08210.pdf"

