Serbian Automatic Text Generator Released

GPT-2 in Serbian


Until a few weeks ago, GPT-2 models were available only in English, German, French, Chinese, Japanese, Arabic, Turkish, Italian, Portuguese, Persian, Dutch, Indonesian, Russian, Spanish, and Macedonian.

A few weeks ago, a GPT-2 model became available in Serbian as well: a deep-learning language model trained in an entirely self-supervised way.

Check it out at https://huggingface.co/macedonizer/sr-gpt2?text=Ја+сам+био
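Beyond the web demo, the model can also be queried locally. A minimal sketch using the Hugging Face transformers library, with the same prompt as in the link above ("Ја сам био", "I was"):

```python
from transformers import pipeline

# Download the Serbian GPT-2 model from the Hugging Face hub
# and wrap it in a text-generation pipeline.
generator = pipeline("text-generation", model="macedonizer/sr-gpt2")

# Generate a continuation of the Serbian prompt "Ја сам био".
out = generator("Ја сам био", max_length=30, num_return_sequences=1)
print(out[0]["generated_text"])
```

The pipeline returns the prompt followed by the generated continuation; sampling parameters such as `max_length` can be adjusted freely.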

I know, it is not as good as its English counterpart: it was trained on only 900 MB of Serbian Wikipedia content, in both Cyrillic and Latin script.

However, it is a decent starting point for fine-tuning on the writings of famous Serbian authors. I promise.
