NDAK19001U Advanced Topics in Natural Language Processing (ATNLP)

Volume 2021/2022
Content

The purpose of this course is to expose students to selected advanced topics in natural language processing. The course will bring students up to a level sufficient for writing their master's thesis in this area. The course is relevant for computer science students, as well as students from other study programmes with a good mathematical background, and students in the IT & Cognition programme. Please refer to the recommended academic qualifications.

 

Examples of topics include:

  • Natural language understanding

  • Representation learning

  • Multitask learning

  • Learning from multiple modalities

  • Deep generative models

  • Reinforcement learning

  • Generative adversarial learning

 

* The exact list of topics for the current year will depend on the lecturers and on current trends in natural language processing research, and will be announced on the course Absalon website. Feel free to contact the course organiser for details.

Learning Outcome

Knowledge of

  • Selected advanced topics in natural language processing, including:

    • design of learning algorithms

    • evaluation of learning algorithms

Skills to

  • Read and understand recent scientific literature in the field of natural language processing

  • Apply the knowledge obtained by reading scientific papers

  • Compare methods and assess their potentials and shortcomings

Competences to

  • Understand advanced methods and transfer the knowledge gained to solutions of practical problems

  • Plan and carry out self-learning

Literature

See Absalon.

The course requires a strong mathematical background and an understanding of natural language processing. It is suitable for computer science master's students, as well as students from mathematics (statistics, actuarial mathematics, mathematics-economics, etc.) and the IT & Cognition study programme, provided the latter have a strong mathematical background. If you are a student from another study programme, we strongly advise you to contact the course organiser and verify the suitability of your background prior to signing up for the course.

It is assumed that students have successfully passed either the “Natural Language Processing” course from KU, the “NLP and Deep Learning” course from ITU, or the “Language Processing 1” and “Language Processing 2” courses from KU. If you have not passed any of these courses, please contact the course organiser to verify the suitability of your background prior to signing up for the course.

Academic qualifications equivalent to a BSc degree are recommended.
Lectures and class instruction, student presentations, peer feedback, group projects, and project presentations.
Category               Hours
Lectures               28
Preparation            60
Practical exercises    33
Project work           85
Total                  206
Feedback form

  • Oral
  • Individual
  • Collective
  • Continuous feedback during the course of the semester
  • Peer feedback (students give each other feedback)
Credit
7,5 ECTS
Type of assessment
Continuous assessment
The assessment is based on the following five parts:

1. Completion of weekly quizzes.

2. Class presentation of an academic paper.

3. Peer feedback on other students' class presentations.

4. Group presentation of the group project, in which a model from the literature is re-implemented.

5. An individual 5-page written report on the student's efforts to replicate the model and the results of the replication.

Each part-exam is assessed and weighted individually, and the final grade is determined on this basis.
Aid
All aids allowed
Marking scale
7-point grading scale
Censorship form
No external censorship
Several internal examiners.

Exam period

Re-exam

The re-exam consists of two parts:

1. An individual written report. The written report is to be submitted no later than 3 weeks before the re-exam week.

2. A 30-minute oral examination without preparation.

The two parts will be given an overall assessment.

Criteria for exam assessment

See Learning Outcome.