QuantumStat: NLP on Steroids

Image by Gerd Altmann from Pixabay


Introduction

Another post related to NLP and the transfer-learning paradigm.

Since, at least in the post-BERT era, training your own NLP model from scratch is rarely the best option, the usual way of doing things is:

  1. Find a model that is already fine-tuned for your purpose, and
  2. Use it, running it locally.

The ultimate beauty of this approach lies in the following facts:

  • All of the models fall into only a few classes
  • Every model within a class can be used through the same API. If you want to experiment with different models, you just change the model name (as sketched below).
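
A minimal sketch of what that looks like with the HuggingFace transformers library (both model names below are just examples of publicly hosted checkpoints):

```python
from transformers import pipeline

# Same task, same API -- swapping models is just a matter of changing one string.
classifier_a = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
classifier_b = pipeline(
    "sentiment-analysis",
    model="nlptown/bert-base-multilingual-uncased-sentiment",
)

text = "Quantum Stat makes it easy to find a fine-tuned model."
print(classifier_a(text))
print(classifier_b(text))
```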

AdapterHub enhanced this concept by providing thin fine-tunes (adapters) on top of models already available on HuggingFace.
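
A rough sketch of that workflow with AdapterHub's adapter-transformers library; treat the class name and the adapter identifier below as illustrative, in the style of their documentation examples:

```python
from transformers import AutoModelWithHeads

# Start from a plain pre-trained backbone...
model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

# ...then download a thin, task-specific adapter from AdapterHub and activate it.
# Only the small adapter weights differ from the base model.
adapter_name = model.load_adapter("sentiment/sst-2@ukp")
model.set_active_adapters(adapter_name)
```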

This article is about Quantum Stat:

Models

Quantum Stat offers NLP resources aggregated from different sources:

https://models.quantumstat.com/

There are a lot of models available for the following NLP tasks, each of them accompanied by a code snippet that shows how to use it (a representative example follows the list):

  • Sequence Classification
  • Token Classification
  • Question Answering
  • Summarization
  • CommonSense
  • Natural Language Inference
  • Automatic Text Generation
  • Conversational
  • Machine Translation
  • Text-to-Speech
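
For instance, a summarization snippet roughly follows the pattern below (the model name is one example of a publicly available checkpoint, not necessarily the one Quantum Stat suggests):

```python
from transformers import pipeline

# One possible summarization checkpoint hosted on HuggingFace.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = (
    "Quantum Stat aggregates NLP models, datasets and Colab notebooks from "
    "different sources, so that practitioners can pick an already fine-tuned "
    "model for their task instead of training one from scratch."
)
print(summarizer(article, max_length=40, min_length=10, do_sample=False))
```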

The last task on the list, Text-to-Speech, is different from the others. It does not come with a code snippet; instead, it links to the following GitHub repo:

This was a moment of surprise for me, since it is the first time in several years that I have come across information about speech recognition.

Obviously, this is a continuation of the work on the CTC algorithm initiated by Baidu under the lead of Andrew Ng, when they trained a model that recognized Chinese speech with better accuracy than a human being. The keyword here is DeepSpeech:

Datasets

Here we can find a great set of 545 datasets that can be used for experiments with various NLP tasks, each one with a download link and the PDF of the paper that describes it.

Colab Notebooks

A great set of 232 Google Colab notebooks, useful for getting insights into the implementation details of various NLP tasks.

Chatbots

This one is great. Even though the term “conversational” is already covered in the first section, they decided to add a separate section dedicated to chatbots, with the possibility to chit-chat with one of several chatbot models presented on the page.

I tried it; it wasn’t a state-of-the-art chit-chat experience. But it was nice to see where this technology was headed about a year ago, without much fine-tuning.

Question Answering

https://ai1.quantumstat.com/

This is another history lesson from ages ago, with time measured in AI timeframes. One year ago, BERT was taking over the NLP world with its Q&A capabilities. Those famous capabilities can be tried here with a total input of 512 tokens, covering both the questions (there may be more than one question related to the same text) and the text from which BERT for QA will try to extract the answers.

HuggingFace offers tens of already fine-tuned BERT-based models for QA, in different languages and trained on different datasets: SQuAD, SQuAD 2.0, COVID. Again, nice to see this as a history lesson, just to remember where we were only a year ago 🙂
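
A minimal sketch of such a QA snippet, using one of the SQuAD-fine-tuned BERT checkpoints hosted on HuggingFace:

```python
from transformers import pipeline

# BERT fine-tuned on SQuAD extracts an answer span from the given context.
# Question + context together must fit into BERT's 512-token input window.
qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

context = (
    "Quantum Stat collects NLP models, datasets and Colab notebooks. "
    "Its question answering demo is built on BERT models fine-tuned on "
    "datasets such as SQuAD and SQuAD 2.0."
)
result = qa(
    question="Which datasets were the QA models fine-tuned on?",
    context=context,
)
print(result["answer"], result["score"])
```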

Cherry on top

The last thing that was really surprising:

Do you want to take a real-time look at various tweets, with their financial sentiment calculated? Go there!

Conclusion

Quantum Stat is a great place for every NLP enthusiast. So many models, datasets, and Colab notebooks make it worth a bookmark in my browser’s favorites.
