Exploring the impact of word embeddings for disjoint semisupervised Spanish verb sense disambiguation

  • Authors: Cristian Cardellino, Laura Alonso Alemany
  • Published in: Inteligencia artificial: Revista Iberoamericana de Inteligencia Artificial, ISSN-e 1988-3064, ISSN 1137-3601, Vol. 21, No. 61, 2018 (Issue: Inteligencia Artificial (June 2018)), pp. 67-81
  • Language: English
  • Abstract
    • This work explores the use of word embeddings as features for Spanish verb sense disambiguation (VSD). This type of learning technique is called disjoint semisupervised learning: an unsupervised algorithm (i.e., the word embeddings) is first trained on unlabeled data as a separate step, and its results are then used by a supervised classifier. In this work we focus primarily on two aspects of VSD trained with unsupervised word representations. First, we show how the domain on which the word embeddings are trained affects the performance of the supervised task. A specific domain can improve the results if it is shared with the domain of the supervised task, even if the word embeddings are trained on smaller corpora. Second, we show that using word embeddings helps the model generalize compared to not using them: embeddings decrease the model's tendency to overfit.
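The disjoint two-step pipeline described in the abstract can be sketched as follows. This is an illustrative toy, not the paper's actual system: the example corpus, the co-occurrence-plus-SVD embeddings (a stand-in for word2vec-style training), and the nearest-centroid sense classifier are all assumptions chosen to keep the sketch self-contained.

```python
import numpy as np

# Step 1 (unsupervised, on unlabeled text): learn dense word vectors.
# Here we factorize a co-occurrence matrix with SVD as a simple stand-in
# for word2vec-style embedding training.
unlabeled = [
    "the bank approved the loan",
    "she sat on the river bank",
    "the bank raised interest rates",
    "fish swim near the river bank",
]
tokens = sorted({w for s in unlabeled for w in s.split()})
idx = {w: i for i, w in enumerate(tokens)}

# Symmetric within-sentence co-occurrence counts.
C = np.zeros((len(tokens), len(tokens)))
for s in unlabeled:
    ws = s.split()
    for i, a in enumerate(ws):
        for b in ws[:i] + ws[i + 1:]:
            C[idx[a], idx[b]] += 1.0

dim = 4
U, S, _ = np.linalg.svd(C)
emb = U[:, :dim] * S[:dim]  # one embedding row per vocabulary word

# Step 2 (supervised, on labeled data): the embeddings are frozen and
# used only as features for a sense classifier.
def featurize(sentence):
    """Average the embeddings of in-vocabulary context words."""
    vecs = [emb[idx[w]] for w in sentence.split() if w in idx]
    return np.mean(vecs, axis=0)

# Hypothetical labeled examples: "bank" as finance vs. river sense.
labeled = [
    ("the bank approved the loan", "finance"),
    ("the bank raised interest rates", "finance"),
    ("she sat on the river bank", "river"),
    ("fish swim near the river bank", "river"),
]
X = np.stack([featurize(s) for s, _ in labeled])
y = [lab for _, lab in labeled]

# Nearest-centroid classifier: one centroid per sense label.
centroids = {lab: X[[i for i, l in enumerate(y) if l == lab]].mean(axis=0)
             for lab in set(y)}

def predict(sentence):
    f = featurize(sentence)
    return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))

print(predict("the bank offered a loan"))
```

The key property of the disjoint setup is that step 1 never sees the sense labels and step 2 never updates the embeddings; swapping the embedding corpus (general vs. in-domain) changes only step 1, which is how the domain effect in the abstract can be measured.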

