Current Topics in Natural Language Processing (WS 2020-2021)

Summary

Deep Learning is a branch of machine learning in which neural networks consisting of multiple layers have shown new generalization capabilities. The seminar will look at advances both in general deep learning approaches and in the specific case of Neural Machine Translation (NMT). NMT is a new paradigm in data-driven machine translation: the entire translation process is posed as an end-to-end supervised classification problem, where the training data consists of pairs of sentences and the full sequence-to-sequence task is handled by a single model.
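For newcomers, the following is a minimal sketch of this sequence-to-sequence formulation, assuming PyTorch; the tiny encoder-decoder model, vocabulary sizes, and toy batch are illustrative only and are not taken from any of the papers below.

import torch
import torch.nn as nn

SRC_VOCAB, TGT_VOCAB, EMB, HID = 1000, 1200, 32, 64  # illustrative sizes

class Seq2Seq(nn.Module):
    def __init__(self):
        super().__init__()
        self.src_emb = nn.Embedding(SRC_VOCAB, EMB)
        self.tgt_emb = nn.Embedding(TGT_VOCAB, EMB)
        self.encoder = nn.GRU(EMB, HID, batch_first=True)
        self.decoder = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, TGT_VOCAB)

    def forward(self, src, tgt_in):
        # encode the source sentence into a final hidden state
        _, h = self.encoder(self.src_emb(src))
        # decode the target sentence conditioned on that state
        dec_out, _ = self.decoder(self.tgt_emb(tgt_in), h)
        # per-position scores over the target vocabulary
        return self.out(dec_out)

# Training reduces to supervised classification over target tokens:
model = Seq2Seq()
src = torch.randint(0, SRC_VOCAB, (8, 10))      # toy batch: 8 source sentences of length 10
tgt_in = torch.randint(0, TGT_VOCAB, (8, 12))   # target inputs (shifted right in practice)
tgt_out = torch.randint(0, TGT_VOCAB, (8, 12))  # gold next tokens
logits = model(src, tgt_in)                     # shape (8, 12, TGT_VOCAB)
loss = nn.functional.cross_entropy(logits.reshape(-1, TGT_VOCAB), tgt_out.reshape(-1))
loss.backward()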

Here is a link to last semester's seminar.

There is a Munich interest group for Deep Learning with an associated mailing list; paper announcements are sent out on this list. See the link here.

Instructors

Alexander Fraser

Email Address: SubstituteLastName@cis.uni-muenchen.de

CIS, LMU Munich


Hinrich Schütze

CIS, LMU Munich

Schedule

Thursdays 14:45 (s.t.), location ZOOM ONLINE

You can install the Zoom client, or click cancel and use browser support (this might not work in all browsers).

Contact Alexander Fraser if you need the Zoom link.

New attendees are welcome. Read the paper and bring a paper or electronic copy with you; you will need to refer to it during the discussion.

Click here for directions to CIS.

If this page appears to be out of date, use your browser's refresh button.

Date | Paper | Links | Discussion Leader
October 1st | Emily M. Bender, Alexander Koller (2020). Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data. ACL 2020 (best theme paper). | paper | Kerem Şenel
October 8th | Farhad Nooralahzadeh, Giannis Bekoulis, Johannes Bjerva, Isabelle Augenstein (2020). Zero-Shot Cross-Lingual Transfer with Meta Learning. EMNLP 2020. | paper | Alexandra Chronopoulou
October 15th | Vitaly Feldman, Chiyuan Zhang (2020). What Neural Networks Memorize and Why: Discovering the Long Tail via Influence Estimation. arXiv 2020. | paper | Mengjie Zhao
October 22nd | Clara Meister, Tim Vieira, Ryan Cotterell (2020). If beam search is the answer, what was the question? EMNLP 2020. | paper | Jindřich Libovický
October 29th | Micah Shlain, Hillel Taub-Tabib, Shoval Sadde, Yoav Goldberg (2020). Syntactic Search by Example. arXiv 2020. | paper | Yoav Goldberg
November 5th | Ivan Vulić, Edoardo Maria Ponti, Robert Litschko, Goran Glavaš, Anna Korhonen (2020). Probing Pretrained Language Models for Lexical Semantics. EMNLP 2020. | paper | Viktor Hangya
November 12th | Text generation with transformers (see the abstract). | abstract | Martin Schmitt
November 26th | Anonymous (2020). Isotropy in the Contextual Embedding Space: Clusters and Manifolds. Submitted to ICLR 2021. | blog, paper | Sheng Liang
December 3rd | Hao Tan, Mohit Bansal (2020). Vokenization: Improving Language Understanding with Contextualized, Visual-Grounded Supervision. EMNLP 2020. | paper | Timo Schick
December 10th | Talk by Volker Tresp (LMU, Siemens) on current work: The Tensor Brain: Semantic Decoding for Perception and Memory. Reading is optional; one option is to look at the abstract, introduction, conclusion, and the relationship of the model to the brain. | draft |
December 17th | Ideas 2020 and 2021 | | Denis Peskov
January 21st, 2021 | Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, Sebastian Ruder (2020). MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer. EMNLP 2020. | paper | Philipp Dufter
February 4th, 2021 | Invertible Neural Networks | blog1, blog2 | Haris Jabbar
February 11th, 2021 | Yonatan Bisk, Ari Holtzman et al. (2020). Experience Grounds Language. EMNLP 2020. | paper | Benno Krojer
March 11th, 2021 | Suchin Gururangan, Ana Marasović, Swabha Swayamdipta, Kyle Lo, Iz Beltagy, Doug Downey, Noah A. Smith (2020). Don't Stop Pretraining: Adapt Language Models to Domains and Tasks. ACL 2020 (honorable mention). | paper | Alexandra Chronopoulou
March 18th, 2021 | Devansh Arpit et al. (2017). A Closer Look at Memorization in Deep Networks. ICML 2017. | paper | Mengjie Zhao


Further literature:

You can go back through the previous semesters by clicking on the link near the top of the page.