Until a few weeks ago, GPT-2 models were available only in English, German, French, Chinese, Japanese, Arabic, Turkish, Italian, Portuguese, Persian, Dutch, Indonesian, Russian, Spanish, Macedonian, and Serbian.
As of a few weeks ago, there is a GPT-2 model for the Croatian language as well: an AI/deep-learning model, trained in an entirely self-supervised way.
Check it out at https://huggingface.co/macedonizer/hr-gpt2?text=Ja+sam+bio
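If you would rather try it locally than through the web demo, here is a minimal sketch, assuming the Hugging Face transformers library and PyTorch are installed; it uses the same prompt as the link above:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model ID taken from the URL above.
tokenizer = AutoTokenizer.from_pretrained("macedonizer/hr-gpt2")
model = AutoModelForCausalLM.from_pretrained("macedonizer/hr-gpt2")

# Same prompt as in the demo link: "Ja sam bio" ("I was").
inputs = tokenizer("Ja sam bio", return_tensors="pt")
outputs = model.generate(**inputs, max_length=50, do_sample=True, top_k=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```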
I know, it is not as good as its English counterpart, since it was trained only on the content of the Croatian Wikipedia.
However, it can serve as a decent base for fine-tuning on the writings of some famous Croatian authors. I promise.
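If you want to try that yourself, here is a rough fine-tuning sketch using the standard transformers Trainer, assuming the datasets library is installed and that an author's writings sit in a plain-text file author.txt (a hypothetical path):

```python
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("macedonizer/hr-gpt2")
model = AutoModelForCausalLM.from_pretrained("macedonizer/hr-gpt2")
# GPT-2 has no padding token, so reuse the end-of-text token for padding.
tokenizer.pad_token = tokenizer.eos_token

# "author.txt" is a hypothetical file with one passage of the author's text per line.
dataset = load_dataset("text", data_files={"train": "author.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# mlm=False selects the causal language-modeling objective GPT-2 was trained with.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="hr-gpt2-author", num_train_epochs=3),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```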