Pretrained Models

Pretrained Speech Encoders and Efficient Fine-tuning Methods for Speech Translation: UPC at IWSLT 2022

Our submission to the IWSLT 2022 shared task is an end-to-end speech translation system built on large pretrained models. We leverage efficient fine-tuning techniques like …

Ioannis Tsiamas

The TALP-UPC Participation in WMT21 News Translation Task: an mBART-based NMT Approach

Our submission to the WMT 2021 news translation shared task for German-French translation demonstrates that fine-tuning a pretrained mBART50 model significantly outperforms a …

Carlos Escolano

End-to-End Speech Translation with Pre-trained Models and Adapters: UPC at IWSLT 2021

Our submission to the IWSLT 2021 shared task is an end-to-end speech translation system combining large pretrained models with adapters for efficient fine-tuning. By training …

Gerard I. Gállego