Facebook unveils AI-driven instant translator tool

By Rahul Vaimal, Associate Editor

Social media giant Facebook has unveiled its first-ever AI-driven, open-source multilingual machine translation (MMT) model, which can translate between any pair of 100 languages without relying on English data.

Conventional English-centric translators first translate the original content into English and then into the required language. Facebook’s “M2M-100” model instead uses machine learning to translate directly between language pairs, with no intermediate English step.
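
For readers who want to see what direct, non-pivoted translation looks like in practice, the sketch below is illustrative only: it assumes the checkpoint is loaded through the Hugging Face transformers port (facebook/m2m100_418M), which is separate from Facebook’s own fairseq release.

```python
# Minimal sketch: direct Hindi -> French translation with an M2M-100 checkpoint,
# assuming the Hugging Face "transformers" port (facebook/m2m100_418M) is installed.
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

hindi_text = "जीवन एक चॉकलेट बॉक्स की तरह है।"

# Set the source language, then force the target-language token at decoding time;
# no English pivot is involved.
tokenizer.src_lang = "hi"
encoded = tokenizer(hindi_text, return_tensors="pt")
generated = model.generate(**encoded, forced_bos_token_id=tokenizer.get_lang_id("fr"))
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```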

Facebook AI research assistant Angela Fan described the tool as “a major achievement in the last several years of Facebook AI in terms of machine translation.”

The team states that the tool has been trained on a total of 2,200 language directions, 10 times more than the previous best English-centric multilingual models. “Deploying M2M-100 will improve the quality of translations for billions of people, especially those that speak low-resource languages,” Facebook AI said in its blog post.
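
To put the 2,200 figure in perspective, simple counting (illustrative only, not from Facebook’s post) shows why many-to-many training dwarfs English-centric training: among 100 languages, translating only to and from English covers at most 198 directions, while the full grid of ordered pairs has 9,900.

```python
# Illustrative arithmetic only: counting translation directions among N languages.
N = 100
english_centric = 2 * (N - 1)   # to-English plus from-English: 198 directions
many_to_many = N * (N - 1)      # every ordered pair of distinct languages: 9,900 directions
print(english_centric, many_to_many)
# M2M-100 is reported as trained on 2,200 of these directions,
# roughly ten times the English-centric count.
```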

The new tool outperforms English-centric systems by 10 points on the widely used BLEU metric for evaluating machine translations. The social media giant is making all resources related to the tool, including the model, training, and evaluation setup, public so that “other researchers can reproduce and further advance multilingual models”.
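
BLEU, the metric behind that 10-point claim, scores system output against human reference translations on a 0–100 scale. The sketch below uses the widely used sacrebleu package and is illustrative rather than necessarily the evaluation setup Facebook released.

```python
# Minimal sketch of a BLEU computation with the sacrebleu package
# (illustrative; not necessarily the evaluation setup Facebook released).
import sacrebleu

hypotheses = ["the cat sat on the mat"]            # system translations
references = [["the cat is sitting on the mat"]]   # one reference stream, parallel to hypotheses

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(bleu.score)  # corpus-level BLEU, 0-100; higher is better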

Using novel mining strategies to create translation data, Facebook built the first truly “many-to-many” data set with 7.5 billion sentences for 100 languages.
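
In broad strokes, mining of this kind embeds sentences from different languages into one shared vector space and keeps cross-lingual pairs that score as highly similar. The sketch below is a generic illustration of that idea, with `embed` standing in for a hypothetical multilingual sentence encoder; it is not Facebook’s actual pipeline.

```python
# Generic sketch of embedding-based bitext mining (illustrative, not Facebook's pipeline).
# `embed` is a hypothetical placeholder for a multilingual sentence encoder that maps
# sentences in any language into one shared vector space.
import numpy as np

def mine_pairs(src_sentences, tgt_sentences, embed, threshold=0.8):
    """Pair each source sentence with its most similar target sentence,
    keeping only pairs whose cosine similarity clears the threshold."""
    src_vecs = np.stack([embed(s) for s in src_sentences])
    tgt_vecs = np.stack([embed(t) for t in tgt_sentences])

    # Normalize rows so the dot product below equals cosine similarity.
    src_vecs /= np.linalg.norm(src_vecs, axis=1, keepdims=True)
    tgt_vecs /= np.linalg.norm(tgt_vecs, axis=1, keepdims=True)

    similarities = src_vecs @ tgt_vecs.T
    pairs = []
    for i, row in enumerate(similarities):
        j = int(row.argmax())
        if row[j] >= threshold:
            pairs.append((src_sentences[i], tgt_sentences[j], float(row[j])))
    return pairs
```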

“We used several scaling techniques to build a universal model with 15 billion parameters, which captures information from related languages and reflects a more diverse script of languages and morphology,” the company said.

One challenge in multilingual translation is that a single model must capture information across many different languages and diverse scripts. To address this, Facebook saw a clear benefit in scaling the capacity of its model and adding language-specific parameters.
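
One common way to realize “language-specific parameters” is to keep most layers shared and attach a small per-language branch. The PyTorch sketch below illustrates that idea only; it is not the actual M2M-100 architecture, which Facebook’s post describes only at a high level.

```python
# Illustrative PyTorch sketch: shared parameters plus a small language-specific branch
# (the general idea of language-specific capacity; not the actual M2M-100 architecture).
import torch
import torch.nn as nn

class SharedWithLanguageAdapters(nn.Module):
    def __init__(self, d_model, languages):
        super().__init__()
        self.shared = nn.Linear(d_model, d_model)   # parameters shared by all languages
        self.adapters = nn.ModuleDict(              # one small extra projection per language
            {lang: nn.Linear(d_model, d_model) for lang in languages}
        )

    def forward(self, x, lang):
        h = torch.relu(self.shared(x))
        return h + self.adapters[lang](h)           # residual language-specific branch

layer = SharedWithLanguageAdapters(d_model=512, languages=["en", "fr", "hi", "zh"])
output = layer(torch.randn(2, 10, 512), lang="fr")  # (batch, sequence, d_model)
```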
