A very short intro
Over the past year, NLP tasks have gone low-code thanks to HuggingFace:
Still, if you want to quickly implement semantic search with transformers, you have a plethora of decisions to make:
- Which BERT-like model to use for closed-domain QA
- How to index the content exposed to search and questions
For a small volume of texts, it is not worth spending time on this. You would probably prefer a library that implements it out of the box.
Here comes txtai
And a few links related to it:
txtai provides not only search and question answering. It also indexes the text exposed to search, which makes searching, querying, and answering questions much faster.
I found some unexpected bindings for non-Python languages:
Is AI's heavy Python bias slowly shifting toward JS and Rust? And where is Julia in this picture?
Damn, I’ll miss Python’s tons of libraries for literally EVERYTHING; still, don’t expect huge movements here in the next five years. Also, I was positively surprised by the signs of Rust entering this area.
txtai also comes with zero-shot text classification: you simply describe your target classes with a few words, and voilà, there is a classifier out of the box. Welcome to the transformers & BERT magic. Covered here: Zero-Shot Text Classification – AI Daily News (striki.ai)