Cross-lingual Transfer Learning and Multitask Learning for Capturing Multiword Expressions

    1. [1] University of Wolverhampton (United Kingdom)

  • Published in: Joint Workshop on Multiword Expressions and WordNet (MWE-WN 2019): August 2, 2019, Florence, Italy: Proceedings of the Workshop / Agata Savary (ed.), Carla Parra Escartín (ed.), Francis Bond (ed.), Jelena Mitrovic (ed.), Verginica Barbu Mititelu (ed.), 2019, ISBN 978-1-950737-26-0, pp. 155-161
  • Language: English
  • Abstract
    • Recent developments in deep learning have prompted a surge of interest in the application of multitask and transfer learning to NLP problems. In this study, we explore, for the first time, the application of transfer learning (TRL) and multitask learning (MTL) to the identification of Multiword Expressions (MWEs). For MTL, we exploit the shared syntactic information between MWE and dependency parsing models to jointly train a single model on both tasks, predicting two types of labels: MWE tags and dependency parses. Our neural MTL architecture utilises the supervision of dependency parsing in lower layers and predicts MWE tags in upper layers. In the TRL scenario, we overcome the scarcity of data by learning a model on a larger MWE dataset and transferring the knowledge to a resource-poor setting in another language. In both scenarios, the resulting models achieved higher performance compared to standard neural approaches.
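The abstract describes an MTL architecture in which dependency-parsing supervision is attached to lower layers and MWE tagging to upper layers. The sketch below illustrates that layered two-head layout with plain numpy; all dimensions, layer types, and label counts are invented for illustration and do not come from the paper, which uses a trained neural tagger rather than random weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative only; the paper does not specify these).
n_tokens, d_emb, d_hidden = 5, 8, 6
n_dep_labels, n_mwe_tags = 4, 3

# Token embeddings for one example sentence.
x = rng.normal(size=(n_tokens, d_emb))

# Shared lower layer: both tasks consume its representation.
W_low = rng.normal(size=(d_emb, d_hidden))
h_low = np.tanh(x @ W_low)

# Dependency-label head supervised at the lower layer.
W_dep = rng.normal(size=(d_hidden, n_dep_labels))
dep_logits = h_low @ W_dep

# Upper layer feeding the MWE tagging head.
W_up = rng.normal(size=(d_hidden, d_hidden))
h_up = np.tanh(h_low @ W_up)
W_mwe = rng.normal(size=(d_hidden, n_mwe_tags))
mwe_logits = h_up @ W_mwe

# Joint training would sum a loss (e.g. cross-entropy) over both heads;
# here we only show per-token predictions from each head.
dep_pred = dep_logits.argmax(axis=1)
mwe_pred = mwe_logits.argmax(axis=1)
print(dep_logits.shape, mwe_logits.shape)  # (5, 4) (5, 3)
```

The key design point mirrored here is that the dependency head reads the lower-layer representation while the MWE head reads a further-transformed upper layer, so syntactic supervision shapes the features the MWE tagger builds on.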

