
QuantumStat: NLP on Steroids

Introduction: Another post related to NLP and the transfer-learning paradigm. Since, at least in the post-BERT era, it is no longer preferable to train your own model for anything NLP-related, the usual way of doing things is: find a model that is already fine-tuned for your purpose, and use it, ... Read More

Zero-Shot Text Classification

Introduction: The Stone-Age AI Era. In the beginning, in the stone-age era of AI, you would take several thousand observations and train an old-style fully-connected neural network, and, if you were lucky, you would end up with something useful. That time of training our own neural networks ended long ago, ... Read More
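In practice, zero-shot classification is usually done with a pre-trained NLI-style model (for example via Hugging Face's `pipeline("zero-shot-classification")`), which scores a text against arbitrary candidate labels with no task-specific training. The toy sketch below only illustrates that scoring idea; it deliberately substitutes naive word overlap for a trained model, so the function and scoring rule here are illustrative assumptions, not a real zero-shot implementation.

```python
# Toy illustration of the zero-shot idea: rank a text against arbitrary
# candidate labels without any task-specific training.
# (A real zero-shot classifier uses a pre-trained NLI model instead of
# the naive word-overlap scoring used here.)

def zero_shot_classify(text, candidate_labels):
    """Return (label, score) pairs ranked by a naive overlap score."""
    words = set(text.lower().split())
    scores = {}
    for label in candidate_labels:
        label_words = set(label.lower().split())
        # Overlap between the text and the label description,
        # normalised by the label length.
        scores[label] = len(words & label_words) / len(label_words)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranking = zero_shot_classify(
    "the match ended with a late goal",
    ["sports match report", "stock market news"],
)
print(ranking[0][0])  # -> sports match report
```

The point the post makes survives even in this toy form: the set of labels is supplied at inference time, so the "classifier" can be pointed at categories it has never seen during training.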

AdapterHub: New Kid on the Block for Transfer Learning

Introduction: First, there was BERT; at least, all of this started with it. It is a general-purpose language model, pre-trained for different languages and in different size variations. If you want to use it for some specific task, you have to fine-tune it in a way that you take ... Read More
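The adapter idea behind AdapterHub can be sketched in a few lines: keep the large pre-trained weights frozen and train only a small bottleneck module inserted into each layer, with a residual connection. The NumPy sketch below is a minimal illustration of that structure under assumed toy dimensions; it is not the actual AdapterHub / adapter-transformers API.

```python
import numpy as np

# Minimal sketch of the adapter idea: a frozen "pre-trained" layer plus a
# small trainable bottleneck with a residual connection. During adapter
# training, only W_down and W_up would be updated.

rng = np.random.default_rng(0)
hidden, bottleneck = 8, 2  # toy sizes (assumed for illustration)

# "Pre-trained" layer weights: frozen during adapter training.
W_frozen = rng.standard_normal((hidden, hidden))

# Adapter: down-projection, nonlinearity, up-projection (trainable).
W_down = rng.standard_normal((hidden, bottleneck)) * 0.01
W_up = rng.standard_normal((bottleneck, hidden)) * 0.01

def layer_with_adapter(x):
    h = x @ W_frozen                      # frozen transformation
    a = np.maximum(h @ W_down, 0) @ W_up  # small bottleneck adapter
    return h + a                          # residual connection

x = rng.standard_normal(hidden)
print(layer_with_adapter(x).shape)  # -> (8,)
```

The payoff is the parameter count: here the adapter has 2 * hidden * bottleneck = 32 trainable weights versus 64 frozen ones, and in a real transformer the ratio is far more lopsided, which is why one adapter per task is so much cheaper to store and share than one fully fine-tuned model per task.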

About GPT-3. Again.

There has been a lot of hype about GPT-3 since it appeared. Everybody was so excited about its results. It doesn't need fine-tuning for a specific task, which is quite an impressive result. Also, it is very good at writing poems, code, and SQL queries, all at the same time. Isn't ... Read More