There has been no news on the site for a while, since I hadn’t found anything with a wow effect. However, here are some recent items that caught my attention.
First is PEGASUS, Google’s abstractive summarization model, described in brief at
A more academic and detailed description is available at:
Another cool thing from Google in the AI area is Reformer.
Billed as an important development of Google’s Transformer — the novel neural network architecture for language understanding — Reformer is intended to handle context windows of up to 1 million words, all on a single AI accelerator using only 16GB of memory (https://analyticsindiamag.com/top-machine-learning-projects-launched-by-google-in-2020-till-date/).
It is available on GitHub at:
Keeping in mind that BERT’s input is limited to 512 tokens, handling that many words within a single window gave me a WOW effect.
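To get a feel for why such long windows are hard, recall that standard Transformer self-attention builds an L × L score matrix, so memory grows quadratically with sequence length, while Reformer’s locality-sensitive-hashing (LSH) attention brings the cost down to roughly O(L log L). The sketch below is only a back-of-the-envelope comparison, not Reformer’s actual implementation; the chunking details of real LSH attention are simplified away.

```python
import math

def full_attention_entries(seq_len):
    # A standard Transformer materializes an L x L attention score matrix.
    return seq_len ** 2

def lsh_attention_entries(seq_len):
    # Very rough LSH-attention estimate: hashing groups similar tokens so
    # attention is computed only within small buckets, giving on the order
    # of L * log2(L) score entries (hash rounds and bucket sizes ignored).
    return int(seq_len * math.log2(seq_len))

for length in (512, 1_000_000):
    full = full_attention_entries(length)
    lsh = lsh_attention_entries(length)
    print(f"L={length:>9,}: full attention ~{full:,} entries, "
          f"LSH attention ~{lsh:,} entries")
```

At L = 512 the difference barely matters, but at a million tokens the quadratic term is what makes full attention infeasible on a single 16GB accelerator, which is exactly the regime Reformer targets.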
And, to finish, one simple question to think about…
Does anyone know of somebody else working on AI research these days, putting Google aside?
If so, please let me know.