BERT in Bosnian

The most advanced NLP language model has been available in Bosnian since yesterday. #thisisnotautogeneratedtext Context-aware, it preserves learned semantics and keeps track of syntax, all in just 87M parameters. Trained in a completely unsupervised manner, i.e. give it data and let it learn by itself. macedonizer/ba-roberta-base · Hugging Face ...
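
Since the post points at the model's Hugging Face page, here is a minimal usage sketch (assuming the standard transformers fill-mask pipeline; the Bosnian example sentence is purely illustrative):

```python
from transformers import pipeline

# Load the Bosnian RoBERTa model from the Hugging Face hub.
# RoBERTa-style models mark the blank to fill with the <mask> token.
fill_mask = pipeline("fill-mask", model="macedonizer/ba-roberta-base")

# Illustrative query: "Sarajevo is the capital city of <mask>."
for prediction in fill_mask("Sarajevo je glavni grad <mask>."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

The sibling models in the posts below (e.g. macedonizer/gr-roberta-base, macedonizer/sr-roberta-base) load the same way; only the model identifier changes.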

BERT in Greek

The most advanced NLP language model has been available in Greek since yesterday. #thisisnotautogeneratedtext Context-aware, it preserves learned semantics and keeps track of syntax, all in just 87M parameters. Trained in a completely unsupervised manner, i.e. give it data and let it learn by itself. macedonizer/gr-roberta-base · Hugging Face ...

BERT in Albanian

The most advanced NLP language model has been available in Albanian since yesterday. #thisisnotautogeneratedtext Context-aware, it preserves learned semantics and keeps track of syntax, all in just 87M parameters. Trained in a completely unsupervised manner, i.e. give it data and let it learn by itself. ...

BERT in Slovenian

The most advanced NLP language model has been available in Slovenian since yesterday. #thisisnotautogeneratedtext Context-aware, it preserves learned semantics and keeps track of syntax, all in just 87M parameters. Trained in a completely unsupervised manner, i.e. ...

BERT in Croatian

The most advanced NLP language model has been available in Croatian since yesterday. #thisisnotautogeneratedtext Context-aware, it preserves learned semantics and keeps track of syntax, all in just 87M parameters. Trained in a completely unsupervised manner, i.e. ...

BERT in Serbian

The most advanced NLP language model has been available in Serbian since yesterday. #thisisnotautogeneratedtext Context-aware, it preserves learned semantics and keeps track of syntax, all in just 87M parameters. Trained in a completely unsupervised manner, i.e. give it data and let it learn by itself. macedonizer/sr-roberta-base · Hugging Face ...

BERT in Macedonian

The most advanced NLP language model has been available in Macedonian since yesterday. #thisisnotautogeneratedtext Context-aware, it preserves learned semantics and keeps track of syntax, all in just 87M parameters :-). Trained in a completely unsupervised manner, i.e. give it data and let it learn by itself. ...

Zero-Shot Text Classification

Introduction: The Stone-Age AI Era. In the beginning, in the AI stone-age era, you would take several thousand observations and train an old-style fully-connected neural network, and if you were lucky, you would end up with something useful. That era of training our own neural networks from scratch ended a long time ago, ...
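
As a minimal sketch of the idea the post builds up to (assuming the transformers zero-shot-classification pipeline with an NLI model such as facebook/bart-large-mnli, which is not named in the excerpt):

```python
from transformers import pipeline

# Zero-shot classification: an NLI model scores each candidate label
# against the input text, with no task-specific training data at all.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

result = classifier(
    "A new language model was released on the Hugging Face hub yesterday.",
    candidate_labels=["technology", "sports", "cooking"],
)
print(list(zip(result["labels"], result["scores"])))
```

The labels are supplied at inference time, which is exactly what makes the approach zero-shot: no fully-connected network trained on thousands of labeled observations.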

AdapterHub: New Kid on the Block for Transfer Learning

Introduction. First there was BERT; at least, all of this started with it. It is a general-purpose language model, pre-trained once per language and in different size variants. If you want to use it for a specific task, you have to fine-tune it, which means you take ...
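
As a minimal sketch of the AdapterHub workflow the post describes (assuming the adapter-transformers library of that era, which shipped as a drop-in patched transformers; the adapter identifier here is a hypothetical example, not necessarily one listed on the hub):

```python
from transformers import AutoTokenizer, AutoModelWithHeads

# Instead of fine-tuning all of BERT's weights, keep them frozen and
# plug in a small task-specific adapter downloaded from AdapterHub.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

# Hypothetical AdapterHub identifier for a sentiment adapter.
adapter_name = model.load_adapter("sentiment/sst-2@ukp")
model.set_active_adapters(adapter_name)
```

The pre-trained model stays shared across tasks; each task only adds a few million adapter parameters on top.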