Deep Learning


About GPT-3. Again.

There has been a lot of hype around GPT-3 since it appeared, and everybody was excited about its results. It doesn’t need fine-tuning for a specific task, which is quite impressive. It is also very good at writing poems, code, and SQL queries. Isn’t ... Read More

Facebook Blender

I’m playing a lot these days with chatbots. I found this link interesting: About my own experience with Blender: ... Read More

Size Does Matter

Introduction In the world of NLP and language models, size does matter. Literally, the bigger, the better. Since transformers appeared on the NLP horizon, they have become a de facto synonym for successful language model implementations, overshadowing recurrent neural networks and LSTMs. NLP got its equivalent of ... Read More

My AI Programming Heroes During Corona Days

One Long Introduction Currently I’m working on a question-answering system based on BERT. Some online resources for it: Since its announcement and open-sourcing, many optimizations and variations have appeared. There are smaller and faster versions optimized for memory footprint and speed on mobile devices. Also, after some research, some variations ... Read More
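The decoding step of a BERT-based QA system can be sketched without the model itself: given per-token start and end logits, pick the highest-scoring valid answer span. This is a toy sketch with hypothetical logit values, not the actual system described above.

```python
import numpy as np

def best_span(start_logits, end_logits, max_len=15):
    """Pick the (start, end) pair maximizing start + end logit,
    subject to end >= start and a maximum span length."""
    best, best_score = (0, 0), -np.inf
    for s in range(len(start_logits)):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = start_logits[s] + end_logits[e]
            if score > best_score:
                best_score, best = score, (s, e)
    return best

# toy logits for a 6-token context (hypothetical values)
start = np.array([0.1, 2.0, 0.3, 0.2, 0.1, 0.0])
end   = np.array([0.0, 0.1, 0.2, 3.0, 0.1, 0.0])
print(best_span(start, end))  # → (1, 3)
```

In a real pipeline these logits come from the model’s two output heads, and the token span (1, 3) would be mapped back to the original text via the tokenizer’s offsets.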

Finally Something Interesting From Microsoft

Introduction It seemed to me that Google, Facebook, and OpenAI were the only ones working on something in the AI area. Still, tech giants like Microsoft are working on interesting things as well. Recently Microsoft unveiled its Turing-NLG model with 17B parameters, and it was the most ... Read More

Sexism and Racism in AI Models

Introduction I wasn’t able to find anything interesting for today’s news. Yet I wanted to write something, so I’m sharing my observations and opinions about recent happenings in image recognition technology. Today’s post is about sexism and racism in AI models. In particular, I’m ... Read More

GPT-2 Writing Python Code

GPT-2 pretrained with code from GitHub can write code by itself, as autocomplete functionality. Watch the video: Just think about a GPT-3-scale code generator 🙂 Sounds scary to me … ... Read More

Another Google Gamechanger Explained

Yesterday I wrote about Google’s Reformer in Today I’m sharing some technical information about how it is implemented to allow an input size of 1M tokens within 16GB of memory. There is a Jupyter notebook that sheds light on this, available at Enjoy. ... Read More
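The memory argument behind Reformer’s 1M-token claim can be sketched with back-of-the-envelope arithmetic: full self-attention stores an n × n score matrix, while Reformer’s LSH attention only compares queries within small buckets, so memory scales with n × c for some chunk size c. The chunk size of 64 below is an illustrative assumption, not Reformer’s exact configuration.

```python
# Full self-attention over n tokens materializes an n x n score matrix.
n = 1_000_000
bytes_full = n * n * 4  # float32 scores
print(f"full attention:  {bytes_full / 1e12:.0f} TB")   # ~4 TB

# LSH attention only scores tokens within buckets of ~c tokens,
# so the score memory scales as n * c instead of n * n.
c = 64  # illustrative chunk size
bytes_lsh = n * c * 4
print(f"LSH attention:   {bytes_lsh / 1e9:.2f} GB")     # ~0.26 GB
```

That gap, plus reversible layers that avoid storing per-layer activations, is what makes a 1M-token input fit in 16GB.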

AI news after a while

There were no news on the site for a while since I didn’t find anything with wow effect. However, here are some recent news that occupied my attention. PEGASUS First is PEGASUS. It is Google’s abstractive summarization model, as described in short at More accademic and detailed description is available ... Read More