The 2026 episode at the Faculty of Mathematics, Physics and Informatics of Comenius University
- Lectures: M-XII, Friday 9:00 - 10:20 (voluntary)
- Labs: M-XII, Friday 10:20 - whenever (voluntary)
Course description
This course goes deeper into how human language (say English or Slovak) can be represented in a way that computational systems (a.k.a. computer programs) can process, and how this representation can then be used to do interesting things, such as
- translation
- question answering
- grammatical error correction
- summarization
- text generation (think poems or song lyrics)
- and much more…
All of this is part of a field called Natural Language Processing, which also ended up being the name of the course.
Feel free to check out the previous year’s class webpage as well!
Lectures
Lecture I: The NLP Wohoo
What can language models do – and how did we get here?
- Discussed material:
- Live demos: translation, question answering, reasoning, code generation, multimodal understanding, agents
- A whirlwind tour of NLP: from ELIZA (1966) to GPT-5 (2026)
- What this course will cover and how it all fits together
- Supplementary resources:
- Eliza Bot Demo: one of the most famous early NLP systems. Try having a conversation – you may be surprised.
- Intro to Large Language Models by Andrej Karpathy: probably the best one-hour introduction to the current state of LLMs.
- LMSYS Chatbot Arena Leaderboard: as many would say, “the only benchmark worth taking a look at”.
More lectures coming soon…
Resources
Introduction to Natural Language Processing by Jacob Eisenstein
Speech and Language Processing, 3rd Edition by Daniel Jurafsky and James H. Martin
A Primer on Neural Network Models for Natural Language Processing by Yoav Goldberg
Neural Network Methods for Natural Language Processing by Yoav Goldberg
Similar Courses Elsewhere
There are more than a few similar (and oftentimes even better) courses out there. Here is a sample:
- Computational Linguistics @ UPenn
- Natural Language Processing @ CMU
- CS 224N: NLP with Deep Learning @ Stanford
- Advanced NLP (11-711) @ CMU
- Quantitative Methods for NLP (6.8610) @ MIT
- Large Language Models @ ETH Zurich
Grading
| Component | Weight |
|---|---|
| Assignments | 50% |
| Project | 50% |
Assignments are distributed via Google Classroom (the class code is hcexzhmi), and they are also available in the following repository on GitHub: https://github.com/NaiveNeuron/nlp-exercises
Check out the Project Ideas for 2026!
| Points | Grade |
|---|---|
| (90, inf] | A |
| (80, 90] | B |
| (70, 80] | C |
| (60, 70] | D |
| (50, 60] | E |
| [0, 50] | FX |