X-ABI: Toward parameter-efficient multilingual adapter-based inference for cross-lingual transfer
Abstract
Natural Language Inference (NLI) for low-resource languages is challenging because sufficient training samples for downstream tasks are unavailable. This work proposes to leverage the large corpora available for high-resource languages, together with transfer learning, to assist low-resource languages. We propose a method, multilingual adapter-based inference (X-ABI), that interpolates through the language spaces of multilingual language models using adapters in transformers. By training efficiently on the samples available for mainstream languages, the model achieves comparable accuracies on downstream tasks. We show that training only ~0.1% of the parameters on the English dataset performs 20.9% better than the baseline and is comparable to the state of the art for the other languages in the XNLI dataset, with 4 hours of training on a free Google Colab GPU.
Keywords: Natural Language Inference; Natural Language Processing; Transfer learning; Cross-lingual transfer; Adapter-based inference
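To make the parameter-efficient setup concrete, the sketch below trains a single task adapter on English XNLI with the AdapterHub `adapters` library while the multilingual backbone stays frozen, which is the general recipe the abstract describes. The backbone (`xlm-roberta-base`), adapter name, and training hyperparameters are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch (not the authors' exact setup): train one task adapter on English
# XNLI with the backbone frozen, then reuse the model zero-shot on other XNLI
# languages. Assumes the AdapterHub `adapters` library and HF `datasets`.
from adapters import AutoAdapterModel, AdapterTrainer
from transformers import AutoTokenizer, TrainingArguments
from datasets import load_dataset

model_name = "xlm-roberta-base"           # assumed backbone, not confirmed by the paper
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoAdapterModel.from_pretrained(model_name)

# Add a task adapter plus a 3-way NLI head; freeze all backbone weights.
model.add_adapter("xnli")
model.add_classification_head("xnli", num_labels=3)
model.train_adapter("xnli")               # only adapter (+head) parameters remain trainable

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable fraction: {trainable / total:.4%}")  # a small fraction of the full model

def encode(batch):
    # XNLI examples are premise/hypothesis pairs with a 3-class label.
    return tokenizer(batch["premise"], batch["hypothesis"],
                     truncation=True, padding="max_length", max_length=128)

train = load_dataset("xnli", "en", split="train").map(encode, batched=True)

args = TrainingArguments(output_dir="xabi-adapter",
                         per_device_train_batch_size=32,
                         num_train_epochs=1,
                         learning_rate=1e-4)
trainer = AdapterTrainer(model=model, args=args, train_dataset=train)
trainer.train()
```

Because only the adapter and classification head receive gradients, the same frozen multilingual backbone can then be evaluated on the non-English XNLI splits without any further training.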
@InProceedings{menon2023xabi,
  author    = "Menon, Aditya Srinivas and Anand, Konjengbam",
  title     = "X-ABI: Toward Parameter-Efficient Multilingual Adapter-Based Inference for Cross-Lingual Transfer",
  booktitle = "Data Management, Analytics and Innovation",
  year      = "2023",
  publisher = "Springer Nature Singapore",
  address   = "Singapore",
  pages     = "303--317",
  isbn      = "978-981-99-1414-2"
}