
Adaptation of Deep Bidirectional Multilingual Transformers for Russian Language

Citations: 257
References: 8
Year: 2019

TLDR

Pre-trained bidirectional language models achieve state-of-the-art results on tasks such as reading comprehension, natural language inference, and sentiment analysis; while monolingual models perform better in language-specific settings, multilingual models enable cross-lingual transfer. The study introduces methods for adapting multilingual masked language models to a specific language: fine-tuning a multilingual model on language-specific data yields a monolingual model. This transfer from a multilingual to a monolingual model brings substantial gains on reading comprehension, paraphrase detection, and sentiment analysis, and markedly reduces training time; the resulting pre-trained Russian models are released as open source.

Abstract

The paper introduces methods for adapting multilingual masked language models to a specific language. Pre-trained bidirectional language models show state-of-the-art performance on a wide range of tasks, including reading comprehension, natural language inference, and sentiment analysis. At the moment there are two alternative approaches to training such models: monolingual and multilingual. While language-specific models show superior performance, multilingual models allow transfer from one language to another and can solve tasks for different languages simultaneously. This work shows that transfer learning from a multilingual model to a monolingual model significantly improves performance on tasks such as reading comprehension, paraphrase detection, and sentiment analysis. Furthermore, multilingual initialization of a monolingual model substantially reduces training time. Pre-trained models for the Russian language are open-sourced.
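
To make the adaptation recipe concrete, below is a minimal sketch of the core idea: initialize from a multilingual checkpoint and continue masked-language-model pretraining on language-specific data. It assumes the Hugging Face transformers and datasets libraries, the bert-base-multilingual-cased checkpoint, and a placeholder corpus file russian_corpus.txt; these are illustrative assumptions, not the authors' original implementation.

```python
# Sketch: adapt a multilingual masked LM to Russian by continued MLM pretraining.
# Assumes Hugging Face transformers/datasets; hyperparameters are placeholders.
from datasets import load_dataset
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# The multilingual checkpoint serves as the initialization for the
# monolingual (Russian) model, instead of training from scratch.
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")

# Any plain-text Russian corpus works; "russian_corpus.txt" is hypothetical.
dataset = load_dataset("text", data_files={"train": "russian_corpus.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

# Continue masked-language-model pretraining on the language-specific data;
# the collator randomly masks 15% of tokens, as in standard BERT pretraining.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)
trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="rubert-sketch",
        num_train_epochs=1,
        per_device_train_batch_size=8,
    ),
    train_dataset=dataset,
    data_collator=collator,
)
trainer.train()
```

Because the weights start from the multilingual model rather than random initialization, the monolingual model both converges faster and ends up stronger on downstream Russian tasks, which is the paper's central finding.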
