Automatic Translation of Sentences to Mexican Sign Language: Rule-based Machine Translation and Animation Synthesis in Avatar

  • Authors: Bella Martínez-Seis, Obdulia Pichardo Lagunas, Eliot Hernández-Morales, Óscar Rivera-Rodríguez, Sabino Miranda
  • Published in: Computación y Sistemas (CyS), ISSN 1405-5546, ISSN-e 2007-9737, Vol. 29, No. 1, 2025, pp. 145-155
  • Language: English
    • Abstract: Sign languages are used primarily by deaf people, and translation between Mexican Spanish and Mexican Sign Language (LSM) remains an unresolved challenge. This paper addresses two requirements for a proper translation: automatic translation and sign representation. The first concerns the syntax of each language; the second concerns rendering sequences of signs. We propose a tool that translates sentences from written Spanish to Mexican Sign Language while accounting for the syntax of both languages. Because no large parallel corpus is available, we use rule-based machine translation; the BLEU score for the translation was about 0.8061, which suggests a good translation. To display the signs, we use a 3D humanoid avatar. Sign languages have no standard written form, so we describe each sign with a configuration matrix. We propose a sign language synthesis process that takes the configuration matrix of each sign and generates animation rules describing the full sequence of movements and positions the avatar follows to produce the signs, which makes it easy to extend the set of signs the avatar can represent.
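The abstract describes translating written Spanish into LSM glosses with hand-written rules rather than a learned model. The paper does not list its actual grammar rules or lexicon, so the sketch below is purely illustrative: the `DROP` set, the `LEXICON` mapping, and the verb-final reordering rule are assumptions standing in for whatever rule set the authors used. It only shows the general shape of rule-based translation (drop function words, map lemmas to sign glosses, reorder):

```python
# Illustrative sketch of rule-based Spanish -> LSM gloss translation.
# The stopword list, lexicon, and reordering rule below are hypothetical
# examples, not the rules from the paper. LSM glosses are conventionally
# written in uppercase.

DROP = {"el", "la", "los", "las", "un", "una", "es", "de"}

# Hypothetical lemma-to-gloss lexicon.
LEXICON = {
    "niño": "NIÑO",
    "come": "COMER",
    "manzana": "MANZANA",
}

def to_glosses(sentence: str) -> list:
    """Tokenize, drop function words, and map each word to a sign gloss."""
    tokens = [t.strip(".,").lower() for t in sentence.split()]
    content = [t for t in tokens if t not in DROP]
    return [LEXICON.get(t, t.upper()) for t in content]

def reorder_verb_final(glosses, verbs={"COMER"}):
    """Example syntactic rule: move verb glosses to the end of the sentence."""
    rest = [g for g in glosses if g not in verbs]
    verb = [g for g in glosses if g in verbs]
    return rest + verb

print(reorder_verb_final(to_glosses("El niño come una manzana")))
# -> ['NIÑO', 'MANZANA', 'COMER']
```

A real system of this kind would replace the toy lexicon with a sign dictionary and the single reordering rule with a full set of syntactic transfer rules, but the pipeline structure (lexical transfer followed by structural transfer) is the same.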

The article metadata were obtained from SciELO México.
