Deep Learning


Now It’s Official: We Didn’t Understand How Neural Networks Work Until Now

First, we created neural networks with neurons as building blocks, by analogy with the neurons in the human brain. Then we started to think that we could figure out what is going on in the human brain by analyzing the behavior of artificial neural networks. What arrogance: we are not even close to reproducing the behavior of neurons in the human brain, and eons away from figuring out what is going on in human brains. ... Read More
BERT in Bosnian


The most advanced NLP language model has been available in Bosnian since yesterday. #thisisnotautogeneratedtext Context-aware, preserving learned semantics, and keeping track of syntax, all in only 87M parameters. Trained in a completely unsupervised manner, i.e. give it data and let it learn by itself. macedonizer/ba-roberta-base · Hugging Face ... Read More
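For readers who want to try the model, here is a minimal sketch of loading it with the Hugging Face `transformers` library. The model ID `macedonizer/ba-roberta-base` comes from the post above; the example sentence is an illustrative assumption, not from the original.

```python
MODEL_ID = "macedonizer/ba-roberta-base"  # model ID from the post above


def fill_in_blank(text: str):
    """Fill the <mask> token in `text` using the Bosnian RoBERTa model.

    The model weights are downloaded from the Hugging Face Hub on first use,
    so the import and pipeline construction are deferred until the call.
    """
    from transformers import pipeline  # deferred: heavy dependency

    fill_mask = pipeline("fill-mask", model=MODEL_ID)
    return fill_mask(text)


# Example call (illustrative Bosnian sentence, an assumption of this sketch):
# fill_in_blank("Sarajevo je glavni <mask> Bosne i Hercegovine.")
```

Each call returns the model's top-ranked completions for the masked token, scored by probability.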
BERT in Greek


The most advanced NLP language model has been available in Greek since yesterday. #thisisnotautogeneratedtext Context-aware, preserving learned semantics, and keeping track of syntax, all in only 87M parameters. Trained in a completely unsupervised manner, i.e. give it data and let it learn by itself. macedonizer/gr-roberta-base · Hugging Face ... Read More
BERT in Albanian


The most advanced NLP language model has been available in Albanian since yesterday. #thisisnotautogeneratedtext Context-aware, preserving learned semantics, and keeping track of syntax, all in only 87M parameters. Trained in a completely unsupervised manner, i.e. give it data and let it learn by itself. ... Read More
BERT in Slovenian


The most advanced NLP language model has been available in Slovenian since yesterday. #thisisnotautogeneratedtext Context-aware, preserving learned semantics, and keeping track of syntax, all in only 87M parameters. Trained in a completely unsupervised manner, i.e. ... Read More
BERT in Croatian


The most advanced NLP language model has been available in Croatian since yesterday. #thisisnotautogeneratedtext Context-aware, preserving learned semantics, and keeping track of syntax, all in only 87M parameters. Trained in a completely unsupervised manner, i.e. ... Read More
GPT-2 in Serbian

BERT in Serbian

The most advanced NLP language model has been available in Serbian since yesterday. #thisisnotautogeneratedtext Context-aware, preserving learned semantics, and keeping track of syntax, all in only 87M parameters. Trained in a completely unsupervised manner, i.e. give it data and let it learn by itself. macedonizer/sr-roberta-base · Hugging Face ... Read More
GPT-2 in Macedonian

BERT in Macedonian

The most advanced NLP language model has been available in Macedonian since yesterday. #thisisnotautogeneratedtext Context-aware, preserving learned semantics, and keeping track of syntax, all in only 87M parameters :-). Trained in a completely unsupervised manner, i.e. give it data and let it learn by itself. ... Read More