Facebook research suggests pretrained AI models can be easily adapted to new languages

Multilingual masked language modeling, in which a single model is trained on text from several languages at once, is a technique that has been used to great effect. In March, a team introduced an architecture that can jointly learn sentence representations for 93 languages belonging to more than 30 different language families. But most previous work in language modeling has investigated cross-lingual […]
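To make the training objective concrete, here is a minimal sketch of the masking step at the heart of masked language modeling. It assumes a simple whitespace tokenizer, and `mask_tokens` is a hypothetical illustrative helper, not Facebook's actual implementation: a random subset of tokens is hidden behind a `[MASK]` placeholder, and the model is trained to recover the originals from context. In the multilingual setting, the same objective is simply applied to text drawn from many languages.

```python
import random

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Hypothetical sketch of the masked-LM objective: replace a random
    subset of tokens with [MASK] and return both the masked sequence and
    the gold labels the model must learn to predict."""
    rng = random.Random(seed)
    masked, labels = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # model is trained to recover this token
            masked.append(mask_token)
        else:
            masked.append(tok)
    return masked, labels

tokens = "the cat sat on the mat".split()
masked, labels = mask_tokens(tokens, mask_prob=0.3)
# `masked` is the corrupted input; `labels` maps each masked position
# back to the original token the model should predict.
```

In real systems the 15% masking rate and subword tokenization follow the BERT recipe; this toy version only shows the shape of the objective.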