NDAK19001U Advanced Topics in Natural Language Processing (ATNLP)

Volume 2019/2020
Content

The purpose of this course is to expose students to selected advanced topics in natural language processing. The course will bring students up to a level sufficient for writing their master's thesis in this area. The course is relevant for computer science students, as well as students from other study programmes with a good mathematical background, and students in the IT & Cognition programme. Please refer to the recommended academic qualifications.

 

Examples of topics include:

  • Natural language understanding

  • Representation learning

  • Multitask learning

  • Learning from multiple modalities

  • Deep generative models

  • Reinforcement learning

  • Generative adversarial learning

 

* The exact list of topics in the current year will depend on the lecturers and trends in natural language processing research and will be announced on the course Absalon website. Feel free to contact the course organiser for details.

Learning Outcome

Knowledge of

  • Selected advanced topics in natural language processing, including:

    • design of learning algorithms

    • evaluation of learning algorithms

Skills to

  • Read and understand recent scientific literature in the field of natural language processing

  • Apply the knowledge obtained by reading scientific papers

  • Compare methods and assess their potentials and shortcomings

Competences to

  • Understand advanced methods and transfer the gained knowledge to solutions of practical problems

  • Plan and carry out self-learning

See Absalon.

The course requires a strong mathematical background and an understanding of natural language processing. It is suitable for master's students in computer science, as well as students from mathematics programmes (statistics, actuarial mathematics, mathematics-economics, etc.) and the IT & Cognition study programme, provided the latter have a strong mathematical background. Students from other study programmes are strongly advised to contact the course organiser to verify the suitability of their background prior to signing up for the course.

It is assumed that students have successfully passed either the “Natural Language Processing” course from KU, the “NLP and Deep Learning” course from ITU, or the “Language Processing 1” and “Language Processing 2” courses from KU. If you have not passed one of these courses, please contact the course organiser to verify the suitability of your background prior to signing up for the course.

Academic qualifications equivalent to a BSc degree are recommended.
Lectures and class instruction, student presentations, peer feedback, group projects, and project presentations
  • Lectures: 28 hours
  • Practical exercises: 33 hours
  • Preparation: 60 hours
  • Project work: 85 hours
  • Total: 206 hours
Oral
Continuous feedback during the course of the semester
Peer feedback (Students give each other feedback)
Credit
7.5 ECTS
Type of assessment
Continuous assessment
The grade will be based on three elements:

1. A class presentation of an academic paper.

2. A presentation of the student's progress on their project to re-implement a model from the literature.

3. A four-page written report on the student's efforts to replicate the model and the results of their replication.

These parts are weighted 40%, 20%, and 40%, respectively. An overall assessment is applied across all three parts of the exam.
Aid
All aids allowed
Marking scale
7-point grading scale
Censorship form
No external censorship
Several internal examiners

Exam period

Re-exam

The re-exam consists of two parts: an individual written report and a 30-minute oral examination without preparation. The two parts will be given an overall assessment. The written report must be handed in no later than 3 weeks before the re-exam week.

Criteria for exam assessment

See learning outcome