
Fall 2016 DS-GA 3001 <Natural Language Understanding with Distributed Representations>

This year I am trying Piazza. Please go to the following page to check the up-to-date syllabus:


https://piazza.com/nyu/fall2016/dsga3001/home


You will be enrolled automatically if you are officially registered for the course.

Overview

How should natural language be understood and analyzed? In this course, we will examine modern computational approaches, based mainly on deep learning, to understanding, processing and using natural language. Unlike conventional approaches to language understanding, we will focus on how to represent and manipulate linguistic symbols in a continuous space.

Target Audience

The course is intended mainly for master's- and doctoral-level students in computer science and data science. The number of seats is limited; priority is given to students enrolled in the master's programme at the Center for Data Science and to those in the Ph.D. programme of the Department of Computer Science, Courant Institute of Mathematical Sciences.

General Information

  • Lecture: 7.10pm - 9.00pm on Tuesdays at Silver Center, Room 405

  • Laboratory: 8.00pm - 9.00pm on Wednesdays, location TBA

  • Office Hours

    • Instructor: 6.00pm - 7.00pm (location: Office 1001, 715 Broadway)

    • TA: TBA (location: TBA)

  • Grading: Prerequisite Exam (10%) + Lab Assignments (25%) + Final Project (25%) + Final Exam (40%)

Prerequisites

A student is expected to be familiar with the following topics:

  • Undergraduate level Probability and Statistics

  • Undergraduate level Linear Algebra

  • Undergraduate level Calculus

  • Machine Learning: DS-GA-1003 or CSCI-UA.0480-007


A student is encouraged to try the following languages/frameworks in advance:


A student is expected to have taken the following courses before taking this course:

  • DS-GA-1002: Statistical and Mathematical Methods

  • DS-GA-1003: Machine Learning and Computational Statistics


This course is complementary to

Schedule (Draft)

6 Sep
  • Lecture: 1. Introduction; 2. Guest Lecture by Prof. Sam Bowman (CDS & Linguistics)
  • Lab: Preliminary Exam Solution by TAs
  • Reading List:
    - Ch. 1 of <Foundations of Statistical Natural Language Processing> by Manning and Schuetze. 1999 (2001). (accessible from NYU)
    - Sec. 1.1.1 and 1.1.2 of <Procedures as a Representation for Data in a Computer Program for Understanding Natural Language> by Terry Winograd. 1971.
    - <Aetherial Symbols> by Geoff Hinton
    - <A Review of B. F. Skinner's Verbal Behavior> by Noam Chomsky. 1967.

13 Sep & 20 Sep
  • Lecture: Machine Learning and Neural Networks
  • Lab: Document classification with n-gram classifiers (1. Logistic Regression; 2. Multilayer Perceptron); Lab Assignment 1
  • Reading List:
    - Video lectures by Hugo Larochelle: 1.1 - 2.11

27 Sep & 4 Oct
  • Lecture: Recurrent neural networks: basics, time series and its modelling
  • Lab: Document classification with recurrent networks (1. RNN Classifier; 2. Tanh, GRU, LSTM); Lab Assignment 2

11 Oct
  • Lecture: Language Modelling & Continuous Space Representation
  • Lab: Language modelling (1. n-gram language modelling with KenLM; 2. Feedforward language modelling); Lab Assignment 3 (a KenLM example follows this schedule)

18 Oct
  • Reading List:
    - <From Sequence Modeling to Translation> by Kyunghyun Cho
    - <From language modeling to machine translation> by Blunsom at DLSS@Montreal 2015
    - <A neural probabilistic language model> by Bengio et al.
    - <Three New Graphical Models for Statistical Language Modelling> by Mnih and Hinton (2007)

25 Oct
  • Reading List:
    - <Aetherial Symbols> by Geoff Hinton
    - <Deep Consequences: Why Neural Networks are Good for (Language) Science> by Felix Hill
    - <The Lighthill Debate (1973)>
    - “Every time I fire a linguist, the performance of the recognizer goes up” by Fred Jelinek, 1998
    - Warren Weaver memorandum, July 1949

1 Nov
  • Lecture: Q&A Session by TAs
  • Lab: Recurrent Language Modelling; Lab Assignment 4

8 Nov & 15 Nov
  • Lecture: Neural machine translation
  • Lab (15 Nov): Q&A
  • Reading List:
    - Sec. 18.8 of <An introduction to machine translation> by W. John Hutchins and Harold L. Somers
    - Warren Weaver memorandum, July 1949
    - Introduction to Neural Machine Translation with GPUs (Parts 1, 2 and 3) by Kyunghyun Cho

22 Nov
  • Lecture: Text as a sequence of words, morphemes or characters?
  • Lab: Q&A

29 Nov
  • Lecture: 1. Connecting dots: sequence modelling to reinforcement learning; 2. Guest Lecture: TBA
  • Lab: Q&A
  • Reading List:
    - <The Bandwagon> by Claude Shannon

6 Dec
  • Lecture: Break (NIPS 2016)
  • Lab: Deadline for Assignment

13 Dec
  • Final Exam

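As a point of reference for the 11 Oct lab, here is a minimal sketch of querying a trained n-gram model through KenLM's Python module. The module name (kenlm) is real, but the model file lm.arpa is a placeholder: you are expected to train it yourself, e.g. with KenLM's lmplz tool.

    # Minimal sketch: scoring text with a trained KenLM n-gram model.
    # Assumes the `kenlm` Python module is installed and that `lm.arpa`
    # (a placeholder name) was trained beforehand, e.g. with
    # `lmplz -o 5 < train.txt > lm.arpa`.
    import kenlm

    model = kenlm.Model('lm.arpa')
    sentence = 'natural language understanding with distributed representations'

    # Total log10 probability of the sentence, with BOS/EOS markers added.
    print(model.score(sentence, bos=True, eos=True))
    # Per-sentence perplexity, a standard language-modelling metric.
    print(model.perplexity(sentence))

The same two calls are enough to compare the n-gram baseline against the feedforward model built in the second half of that lab.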

Lab Assignments

First of all, attendance at the first ten lab sessions is mandatory. Missing any of these sessions will result in a lower grade.


There will be four lab assignments during these ten lab sessions:


  1. Convolutional neural network for document classification

    1. TA in charge: Tian Wang

    2. Deadline: September 28

  2. Bag-of-n-grams and fast document classification

    1. TA in charge: Meihao Chen

    2. Deadline: October 12

  3. Feedforward language modelling

    1. TA in charge: Tian Wang

    2. Deadline: October 26

  4. Character-level recurrent language modelling

    1. TA in charge: Meihao Chen

    2. Deadline: November 16
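
The two language-modelling assignments (3 and 4) are conventionally evaluated by perplexity; the syllabus does not fix an evaluation metric, so take this as background. For a test corpus of N words,

    \mathrm{PPL} = \exp\Big( -\frac{1}{N} \sum_{i=1}^{N} \log p(w_i \mid w_1, \ldots, w_{i-1}) \Big)

so a lower perplexity means the model assigns higher probability to the held-out text.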


For each lab assignment, a student is expected to hand in a short report (up to 3 pages) outlining the model, its implementation and the experimental results. Note that the lecturer's office hours are not meant for assisting students with these assignments.
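
For the two document-classification assignments, the following is a minimal sketch of a bag-of-n-grams baseline. scikit-learn is an assumption here, not a required toolkit, and the two-document training set is a placeholder for whichever dataset the lab specifies.

    # Minimal sketch: a bag-of-n-grams logistic-regression baseline for
    # document classification. scikit-learn and the toy data are
    # placeholders; the lab hand-out defines the actual task.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    train_texts = ['a gripping , well acted drama', 'dull and predictable']
    train_labels = [1, 0]  # 1 = positive, 0 = negative (placeholder labels)

    # Unigram and bigram counts feeding a logistic-regression classifier.
    clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)),
                        LogisticRegression())
    clf.fit(train_texts, train_labels)
    print(clf.predict(['a well acted drama']))

The convolutional model of Assignment 1 plays the same role as the logistic-regression step here, replacing the count features with a learned, non-linear mapping.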

Final Project

In this course, a student is expected to conduct a research project related to the topics presented in the lectures. The topic of each project is to be agreed upon with the lecturer and teaching assistants, based on a topic proposal submitted by the student. The proposal is due October 16 and should describe the topic, method and experimental procedure in up to 4 pages. Once the proposal has been submitted, the student will receive confirmation and feedback by email from the lecturer and/or teaching assistants within two weeks. The proposal must be submitted by email to TA Meihao Chen.


The final report is due on 19 December. It should describe the task, models, experiments and conclusions, and be up to 6 pages long, excluding references, for which unlimited additional pages are reserved (more specific instructions on the format will be announced later). The final report must be submitted by email to TA Tian Wang.


The deadlines for both proposal and final report will not be extended.


Students are encouraged, but not required, to form a team of up to two members. Each team must be formed by September 21, and the list of members must be submitted to the TAs according to the instructions given during the first three lab sessions. Any submission, including the topic proposal and the final report, must clearly state the contribution of each member; failure to do so will result in a lower grade.

Topics

Students are encouraged to choose one of the following candidate topics. Given a clear and compelling reason, a student may work on another topic upon the approval of the lecturer.


Students are encouraged to find recent literature on one of these topics and to prepare to discuss it with the lecturer and/or teaching assistants in order to narrow down a specific topic. Students are encouraged and expected to use the lab sessions and office hours for questions on practical issues in implementing these models and running experiments.

Generic Topics

A student or a team of two students may choose any of the following topics for their final project.


  1. Machine translation [See Ch. 6 in the lecture note]

    1. Goal: Comparison of different paradigms of machine translation

    2. Models: phrase-based translation system (Moses), neural machine translation system (dl4mt or nematus)

    3. Data: more than two language pairs from TED, or one language pair from WMT’16 (a BLEU evaluation sketch follows this list)

  2. Machine Comprehension [Hermann et al., 2015]

    1. Goal: Implementing a question answering system with neural networks

    2. Models: implement two different approaches

    3. Data: CNN Dataset from Google DeepMind (http://cs.nyu.edu/~kcho/DMQA/) or TTIC Who did What [Onishi et al., 2016]

  3. Visual Question-Answering [Antol et al., 2015; Zhou et al., 2016]

    1. Goal: Implementing a visual question-answering system with neural networks

    2. Models: implement two different approaches

    3. Data: VQA from Microsoft (http://www.visualqa.org/)
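
For the machine-translation topic, the usual way to compare systems (e.g. Moses against a neural system) is corpus-level BLEU. Below is a minimal sketch using NLTK's implementation; NLTK is one possible choice rather than a course requirement, and the toy tokenized sentences stand in for real references and system output.

    # Minimal sketch: corpus-level BLEU with NLTK for comparing
    # translation systems. The sentences are toy placeholders.
    from nltk.translate.bleu_score import corpus_bleu

    # One list of reference translations per source sentence (tokenized).
    references = [[['the', 'cat', 'sat', 'on', 'the', 'mat']]]
    # One hypothesis (system output) per source sentence.
    hypotheses = [['the', 'cat', 'is', 'on', 'the', 'mat']]

    # Bigram BLEU here; the toy pair shares no 4-grams, which would make
    # the default 4-gram score degenerate to zero.
    print(corpus_bleu(references, hypotheses, weights=(0.5, 0.5)))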

Special Topics

Each of the following topics may only be taken by one student or one team of two students. Any student who wants to work on one of the following topics needs to come talk to me as soon as possible.


  1. Learning the Natural Language of Black Holes

    1. Mentors: Dr. Daniela Huppenkothen (NYU) and Dr. Victoria Grinberg (MIT)

    2. Description: Click here

  2. Multi-turn Dialogue based Q&A Data Collection Framework

    1. Mentor: Prof. Kyunghyun Cho

    2. Description: Click here

Remarks

A student in this course is expected to act professionally. Please also follow the GSAS regulations on academic integrity, available at http://gsas.nyu.edu/page/academic.integrity
