Marius Mosbach

I am a final-year PhD student in the Department of Language Science and Technology at Saarland University. I am also a member of the Saarbrücken Graduate School of Computer Science and of SFB 1102: Information Density and Linguistic Encoding. My advisor is Dietrich Klakow.

My research is concerned with representation learning and machine learning for natural language processing (NLP). I am interested in developing a better understanding of machine learning applied to NLP, particularly in the context of pre-trained and fine-tuned neural language models.

Building C7.1 Room 0.11

Saarland University

66123 Saarbrücken

news

Oct 21, 2022 Our paper Adapting Pre-trained Language Models to African Languages via Multilingual Adaptive Fine-Tuning won a best paper award 🏆 at COLING 2022.
Jul 01, 2022 I gave invited talks about Unveiling Mysteries of Language Model Fine-tuning at the Technion, Bar-Ilan University, Tel Aviv University, and IBM Tel Aviv.
Jun 01, 2022 I will stay in Tel Aviv from June 7th to June 23rd for an academic visit.
Jun 01, 2022 Our new paper Multilingual Language Model Adaptive Fine-Tuning: A Study on African Languages is now available on arXiv. You can download our adapted models from the Hugging Face model hub (see the short loading sketch below the news list). This work was led by Jesujoba and David.
Oct 01, 2021 Our survey paper Artefact Retrieval: Overview of NLP Models with Knowledge Base Access was accepted at the CSKB workshop @ AKBC 2021. This work was led by Vilém Zouhar.
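
For anyone who wants to try the adapted models mentioned above, here is a minimal loading sketch using the Hugging Face transformers library. The model identifier shown is a placeholder (not the actual hub name); replace it with the identifier of the model you want from the model hub.

```python
# Minimal sketch: load an adapted masked language model from the Hugging Face hub.
# Assumes `transformers` and `torch` are installed; the model identifier below is a
# placeholder, not the real name of our adapted models on the hub.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "your-org/adapted-african-lm"  # hypothetical identifier, replace as needed
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Run a sentence through the model and inspect the output shape.
inputs = tokenizer("Example sentence in one of the target languages.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch_size, sequence_length, vocab_size)
```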