GPT-2 in Serbian


As of yesterday, the most advanced NLP language model is available in Serbian.

#thisisnotautogeneratedtext

Context-aware, preserving learned semantics, and keeping track of syntax: all in one, in only 87M parameters 🙂. Trained in a completely unsupervised manner, i.e. give-it-data-and-let-it-learn-by-itself.

macedonizer/sr-roberta-base · Hugging Face
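For readers who want to try it, here is a minimal sketch of querying the model via the `transformers` library. The example sentence and the fill-mask usage are assumptions on my part: despite the GPT-2 framing above, the linked checkpoint is named like a RoBERTa masked language model, so the natural way to poke at it is with a `<mask>` token.

```python
from transformers import pipeline

# Assumption: the checkpoint exposes the standard RoBERTa masked-LM
# interface, so we query it through the fill-mask pipeline.
fill_mask = pipeline("fill-mask", model="macedonizer/sr-roberta-base")

# Hypothetical example sentence: "Belgrade is the capital <mask> of Serbia."
predictions = fill_mask("Београд је главни <mask> Србије.")

# Each prediction carries the filled-in token, its score, and the sequence.
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

The first call downloads the model weights from the Hugging Face Hub, so it needs a network connection and a few hundred MB of disk space.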
