
Feb 27, 2021

Researchers examine how multilingual BERT models encode grammatical features

Posted in category: robotics/AI

Over the past few decades, researchers have developed deep neural network-based models that can complete a broad range of tasks. Some of these techniques are specifically designed to process and generate coherent text in multiple languages, translate between languages, answer questions about a given text, and summarize news articles or other online content.
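As a rough illustration of the kind of analysis the title refers to, the sketch below shows one common way to test whether a multilingual BERT model encodes a grammatical feature: freeze the model, extract token representations, and fit a simple linear probe that tries to predict the feature. This is only a minimal sketch under assumptions, not the researchers' actual setup; the model name (`bert-base-multilingual-cased`), the toy sentences, and the choice of grammatical number as the probed feature are all illustrative.

```python
# Minimal probing sketch (assumptions, not the study's method): extract token
# representations from frozen multilingual BERT and fit a linear probe that
# predicts a grammatical feature, here number (0 = singular, 1 = plural).
import torch
from transformers import BertTokenizer, BertModel
from sklearn.linear_model import LogisticRegression

tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
model = BertModel.from_pretrained("bert-base-multilingual-cased")
model.eval()

# Hypothetical toy data: a target noun in context plus its number label.
examples = [
    ("The dog sleeps on the couch.", "dog", 0),
    ("The dogs sleep on the couch.", "dogs", 1),
    ("El perro duerme en el sofá.", "perro", 0),
    ("Los perros duermen en el sofá.", "perros", 1),
]

features, labels = [], []
for sentence, target_word, label in examples:
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden_states = model(**inputs).last_hidden_state[0]  # (seq_len, hidden_size)
    # Use the vector of the target word's first subword as the probe input.
    target_id = tokenizer.encode(target_word, add_special_tokens=False)[0]
    position = (inputs["input_ids"][0] == target_id).nonzero()[0].item()
    features.append(hidden_states[position].numpy())
    labels.append(label)

# If a linear probe separates the labels well, the grammatical feature is
# linearly decodable from the frozen mBERT representations.
probe = LogisticRegression(max_iter=1000).fit(features, labels)
print("Probe training accuracy:", probe.score(features, labels))
```

In practice such probes are trained and evaluated on much larger annotated corpora, often across several languages, so that accuracy can be compared between languages and across the model's layers.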
