Current Topics in Natural Language Processing (SS 2020)

Summary

Deep learning is a new branch of machine learning in which neural networks consisting of multiple layers have shown impressive generalization capabilities. The seminar will look at advances both in general deep learning approaches and in the specific case of Neural Machine Translation (NMT). NMT is a new paradigm in data-driven machine translation. In Neural Machine Translation, the entire translation process is posed as an end-to-end supervised classification problem, where the training data consists of pairs of sentences and the full sequence-to-sequence task is handled by a single model.
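To illustrate this framing, here is a minimal sketch (not taken from any seminar paper) of a sequence-to-sequence NMT model trained end-to-end on sentence pairs. It assumes PyTorch; the model names, toy vocabulary sizes, and hyperparameters are all illustrative choices, not a reference implementation.

    # Minimal seq2seq sketch: translation as end-to-end supervised learning
    # over (source, target) sentence pairs. All sizes are toy values.
    import torch
    import torch.nn as nn

    SRC_VOCAB, TGT_VOCAB, EMB, HID = 1000, 1000, 64, 128  # illustrative sizes

    class Seq2Seq(nn.Module):
        def __init__(self):
            super().__init__()
            self.src_emb = nn.Embedding(SRC_VOCAB, EMB)
            self.tgt_emb = nn.Embedding(TGT_VOCAB, EMB)
            self.encoder = nn.GRU(EMB, HID, batch_first=True)
            self.decoder = nn.GRU(EMB, HID, batch_first=True)
            self.out = nn.Linear(HID, TGT_VOCAB)  # per-position classifier over target vocab

        def forward(self, src, tgt_in):
            _, h = self.encoder(self.src_emb(src))               # encode the source sentence
            dec_out, _ = self.decoder(self.tgt_emb(tgt_in), h)   # decode conditioned on it
            return self.out(dec_out)                             # logits for each next token

    model = Seq2Seq()
    loss_fn = nn.CrossEntropyLoss()
    opt = torch.optim.Adam(model.parameters())

    # One supervised training step on a toy batch of sentence pairs:
    src = torch.randint(0, SRC_VOCAB, (8, 10))   # 8 source sentences of length 10
    tgt = torch.randint(0, TGT_VOCAB, (8, 11))   # 8 target sentences of length 11
    opt.zero_grad()
    logits = model(src, tgt[:, :-1])             # predict each next target token
    loss = loss_fn(logits.reshape(-1, TGT_VOCAB), tgt[:, 1:].reshape(-1))
    loss.backward(); opt.step()

The point of the sketch is the last few lines: the "classification" in the summary above is the per-position prediction of the next target token, and a single model is optimized end-to-end from the parallel data.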

Here is a link to last semester's seminar.

There is a Munich interest group for deep learning with an associated mailing list; the paper announcements are sent out on this list. See the link here.

Instructors

Alexander Fraser

Email Address: SubstituteLastName@cis.uni-muenchen.de

CIS, LMU Munich


Hinrich Schütze

CIS, LMU Munich

Schedule

Thursdays at 14:45 (s.t.); location: ZOOM ONLINE (previously room 115)

You can install the Zoom client, or click Cancel and use the browser-based client (which might not work in all browsers).

Contact Alexander Fraser if you need the Zoom link.

New attendees are welcome. Read the paper and bring a paper or electronic copy with you; you will need to refer to it during the discussion.

Click here for directions to CIS.

If this page appears to be out of date, use your browser's refresh button.

Date | Paper | Links | Discussion Leader
May 7th | Jonathan Clark, Eunsol Choi, Michael Collins et al. (2020). TyDi QA: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages. TACL 2020. | paper | Denis Peskov
May 14th | Alex Tamkin, Trisha Singh, Davide Giovanardi, Noah Goodman (2020). Investigating Transferability in Pretrained Language Models. arXiv 2020. | paper | Antonis Maronikolakis
June 18th | Automated Machine Learning (AutoML). See the abstract for the reading. | abstract | Pranav Rai
June 25th | Tom B. Brown, Benjamin Mann, [OpenAI Team] (2020). Language Models are Few-Shot Learners. arXiv 2020. | paper | Hinrich Schütze
July 16th | Zirui Wang, Jiateng Xie, Ruochen Xu, Yiming Yang, Graham Neubig, Jaime Carbonell (2020). Cross-lingual Alignment vs Joint Training: A Comparative Study and A Simple Unified Framework. ICLR 2020. | paper | Viktor Hangya
July 23rd | ACL trip reports | | Hinrich Schütze
July 30th | Francis Willett, Donald Avansino, Leigh Hochberg, Jaimie Henderson, Krishna Shenoy (2020). High-performance brain-to-text communication via imagined handwriting. bioRxiv 2020. | paper | Alex Fraser
August 6th | Kelvin Guu, Kenton Lee, Zora Tung, Panupong Pasupat, Ming-Wei Chang (2020). REALM: Retrieval-Augmented Language Model Pre-Training. arXiv 2020. | paper | Timo Schick
August 13th | Lei Yu et al. (2020). Better Document-Level Machine Translation with Bayes' Rule. TACL 2020. | paper | Dario Stojanovski
August 20th | Zied Bouraoui, Jose Camacho-Collados, Steven Schockaert (2020). Inducing Relational Knowledge from BERT. AAAI 2020. | paper | Martin Schmitt
September 24th | Angela Fan, Thibaut Lavril, Edouard Grave, Armand Joulin, Sainbayar Sukhbaatar (2020). Accessing Higher-level Representations in Sequential Transformers with Feedback Memory. arXiv 2020. | paper | Leonie Weißweiler


Further literature:

You can browse previous semesters' seminars by clicking the link near the top of the page.