Current Topics in Natural Language Processing (SS 2018)

Summary

Deep learning is a branch of machine learning in which neural networks consisting of multiple layers have shown new generalization capabilities. The seminar will look at advances both in general deep learning approaches and in the specific case of Neural Machine Translation (NMT). NMT is a new paradigm in data-driven machine translation: the entire translation process is posed as an end-to-end supervised classification problem, where the training data consists of pairs of sentences and the full sequence-to-sequence task is handled by a single model.
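
To make the end-to-end formulation concrete, below is a minimal sketch of an encoder-decoder model in PyTorch. It is an illustration only, not the architecture of any paper on the schedule; the vocabulary sizes, hidden dimensions, and the toy batch are assumed values chosen for the example.

# A minimal sketch of the encoder-decoder idea behind NMT: the source
# sentence is encoded into a hidden state, and a decoder is trained
# end-to-end to predict the target sentence token by token. All sizes
# below are illustrative assumptions.
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab=1000, tgt_vocab=1000, emb=64, hidden=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.GRU(emb, hidden, batch_first=True)
        self.decoder = nn.GRU(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src_ids, tgt_in_ids):
        # Encode the full source sentence into a final hidden state.
        _, h = self.encoder(self.src_emb(src_ids))
        # Decode conditioned on that state (teacher forcing: the gold
        # target, shifted right, is the decoder input during training).
        dec_states, _ = self.decoder(self.tgt_emb(tgt_in_ids), h)
        return self.out(dec_states)  # one score per target vocabulary item

# Training then reduces to supervised classification over target tokens.
model = Seq2Seq()
src = torch.randint(0, 1000, (2, 7))      # toy batch of 2 source sentences
tgt_in = torch.randint(0, 1000, (2, 5))   # decoder inputs (shifted targets)
tgt_out = torch.randint(0, 1000, (2, 5))  # gold next tokens
logits = model(src, tgt_in)
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 1000), tgt_out.reshape(-1))
loss.backward()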

Here is a link to last semester's seminar.

There is a Munich interest group for Deep Learning with an associated mailing list (initially organized by David Kaumanns); the paper announcements are sent out on this list. See the link here.

Instructors

Alexander Fraser

Email Address: SubstituteLastName@cis.uni-muenchen.de

CIS, LMU Munich


Hinrich Schütze

CIS, LMU Munich

Schedule

NEW: Thursdays at 14:45 (s.t.); the location is room 123 (or sometimes room 131, which is nearby).

Click here for directions to CIS.

New attendees are welcome. Read the paper in advance and bring a paper or electronic copy with you; you will need to refer to it during the discussion.

If this page appears to be out of date, use the refresh button of your browser.

Date | Paper | Links | Discussion Leader
Thursday, April 5th | Fabienne Braune, Tobias Eder, Viktor Hangya, Alexander Fraser (2018). Evaluating bilingual word embeddings on the long tail. NAACL 2018 | paper | Fabienne Braune
Thursday, April 12th | Jacob Buckman, Graham Neubig (2018). Neural Lattice Language Models. arXiv 2018 | paper | Nina Pörner
Thursday, April 19th | Kevin Lin, Dianqi Li, Xiaodong He, Zhengyou Zhang, Ming-Ting Sun (2017). Adversarial ranking for language generation. NIPS 2017 | paper | Mengjie Zhao
Thursday, April 26th | Yuval Pinter, Robert Guthrie, Jacob Eisenstein (2017). Mimicking Word Embeddings using Subword RNNs. EMNLP 2017 | paper | Viktor Hangya
Thursday, May 17th | Laura Wendlandt, Jonathan K. Kummerfeld, Rada Mihalcea (2018). Factors Influencing the Surprising Instability of Word Embeddings. NAACL 2018 | paper | Masoud Jalili Sabet
Thursday, June 14th | NAACL trip report and discussion of upcoming papers | | Hinrich Schütze
Tuesday, June 19th | TALK: Prof. Kyunghyun Cho (NYU) | |
Thursday, June 21st | Duygu Ataman, Marcello Federico (2018). Compositional Representation of Morphologically-Rich Input for Neural Machine Translation. ACL 2018 | paper | Matthias Huck
Thursday, June 28th | Peter Shaw, Jakob Uszkoreit, Ashish Vaswani (2018). Self-Attention with Relative Position Representations. NAACL 2018 | paper | Ben Roth
Thursday, July 5th | Jiatao Gu, Hany Hassan, Jacob Devlin, Victor Li (2018). Universal Neural Machine Translation for Extremely Low Resource Languages. NAACL 2018 | paper | Dario Stojanovski
Thursday, July 12th | Avanti Shrikumar, Peyton Greenside, Anshul Kundaje (2017). Learning Important Features Through Propagating Activation Differences. PMLR 2017 | paper | Alona Sydorova
Thursday, July 26th | ACL trip report | | Hinrich Schütze and others
Thursday, August 9th | Alec Radford, Karthik Narasimhan, Tim Salimans, Ilya Sutskever (2018). Improving Language Understanding by Generative Pre-Training. Preprint (June 2018) | paper, blog post | Nina Pörner
Thursday, September 6th | Martin Raison, Pierre-Emmanuel Mazaré, Rajarshi Das, Antoine Bordes (2018). Weaver: Deep Co-Encoding of Questions and Documents for Machine Reading. Preprint (April 2018) | paper | Eduard Saller
Thursday, September 13th | Ilya Sutskever, James Martens, Geoffrey Hinton (2011). Generating Text with Recurrent Neural Networks. ICML 2011 | paper | Philipp Dufter
Thursday, September 20th | Jie Yang, Shuailong Liang, Yue Zhang (2018). Design Challenges and Misconceptions in Neural Sequence Labeling. COLING 2018 (Best Paper) | paper | Dietrich Trautmann


Further literature:

Please click here for an (old) NMT reading list, but also see the more general RNN reading list here (scroll down). You can also go back through the previous semesters by clicking on the link near the top of the page.