AI news after a while

There has been no news on the site for a while, since I didn’t find anything with a wow effect. However, here is some recent news that caught my attention.

PEGASUS

First up is PEGASUS, Google’s abstractive summarization model, described briefly at

https://ai.googleblog.com/2020/06/pegasus-state-of-art-model-for.html

A more academic and detailed description is available at:

https://arxiv.org/pdf/1912.08777.pdf
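
If you want to try PEGASUS yourself, here is a minimal sketch of running a pretrained checkpoint for summarization. It assumes the Hugging Face transformers library and the google/pegasus-xsum checkpoint, neither of which is mentioned above, so treat it as one possible way to experiment rather than Google’s reference setup.

```python
# Hypothetical quick-start: Hugging Face transformers + google/pegasus-xsum.
from transformers import PegasusForConditionalGeneration, PegasusTokenizer

model_name = "google/pegasus-xsum"
tokenizer = PegasusTokenizer.from_pretrained(model_name)
model = PegasusForConditionalGeneration.from_pretrained(model_name)

article = (
    "PEGASUS is pre-trained by removing whole sentences from a document and "
    "training the model to generate them, an objective deliberately close to "
    "abstractive summarization itself."
)

# Tokenize the input, generate an abstractive summary, and decode it to text.
inputs = tokenizer(article, truncation=True, return_tensors="pt")
summary_ids = model.generate(**inputs, num_beams=4, max_length=60)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```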

Reformer

Another cool thing from Google in the AI area is Reformer.

Billed as an important development of Google’s Transformer (the novel neural network architecture for language understanding), Reformer is designed to handle context windows of up to 1 million words on a single AI accelerator using only 16 GB of memory, as reported at:

https://analyticsindiamag.com/top-machine-learning-projects-launched-by-google-in-2020-till-date/

It is available on GitHub at:

https://github.com/google/trax/tree/master/trax/models/reformer

Keeping in mind that BERT’s input is limited to 512 tokens, handling that many words within a single window gave me a wow effect.
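
To get an intuition for how Reformer keeps attention tractable over such long inputs, here is a toy NumPy sketch of the locality-sensitive hashing (LSH) idea from the paper: query/key vectors are hashed into buckets with a random projection, and attention is then computed only inside each bucket instead of over all pairs. This is a simplified illustration of the concept, not the actual Trax implementation.

```python
import numpy as np

def lsh_buckets(vectors, n_buckets, rng):
    # Random-rotation LSH: project onto n_buckets // 2 random directions and
    # take the argmax over [+projection, -projection] as the bucket id.
    proj = rng.normal(size=(vectors.shape[-1], n_buckets // 2))
    rotated = vectors @ proj
    return np.argmax(np.concatenate([rotated, -rotated], axis=-1), axis=-1)

rng = np.random.default_rng(0)
seq_len, d_model, n_buckets = 16, 8, 4

# Reformer ties queries and keys, so a single set of vectors gets hashed.
qk = rng.normal(size=(seq_len, d_model))
buckets = lsh_buckets(qk, n_buckets, rng)

# Attention is restricted to positions sharing a bucket, so the cost grows
# roughly with the bucket size rather than with seq_len squared.
for b in range(n_buckets):
    members = np.where(buckets == b)[0]
    if len(members) > 1:
        scores = qk[members] @ qk[members].T / np.sqrt(d_model)
        print(f"bucket {b}: positions {members.tolist()}, block {scores.shape}")
```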

And, to wrap up, one simple question to think about…

Does anyone know of somebody other than Google working on AI research these days?

If so, please let me know.
