Under-Actuation Modelling in Robotic Hands via Neural Networks for Sign Language Representation with End-User Validation

    1. [1] Universidad Carlos III de Madrid, Madrid, Spain

  • Published in: Intelligent Data Engineering and Automated Learning – IDEAL 2020. 21st International Conference: Guimarães, Portugal, November 4–6, 2020. Proceedings, Part II / Cesar Analide (ed.), Paulo Novais (ed.), David Camacho Fernández (ed.), Hujun Yin (ed.), Vol. 2, 2020, ISBN 978-3-030-62365-4, pp. 239-251
  • Language: English
  • Abstract
    • This paper presents a study on under-actuation modelling applied to robotic hands aimed at sign language representation. Prior studies using a simulated TEO humanoid robot for representing sign language have shown positive comprehension and satisfaction responses among the deaf and hearing-impaired community. The under-actuated mechanics of the robotic fingers were not considered in the simulated model; thus the correspondence problem arises, as the previously computed joint-space positions cannot be sent directly to the physical system. In addition to the 3:1 and 2:1 ratios of the under-actuated finger mechanisms, tendons and springs introduce stiffness and elasticity that are difficult or infeasible to model, justifying the need for a data-driven approach. Three motor command generators using three different neural network models are analysed and evaluated. Two of the generators are trained in a supervised fashion, and the third involves variational self-supervision and a transformation upon the latent space. The simulated joint-space positions are translated into motor commands for the physical embodied robot to represent a sign language dactylology, which is in turn evaluated by deaf and hearing-impaired end-users.
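To make the data-driven idea concrete, the following is a minimal illustrative sketch (not the authors' code): a small neural network is trained, in a supervised fashion like two of the three generators described above, to map simulated joint-space positions to motor commands. The synthetic target function, the 3:1 transmission ratio plus a small sine term standing in for tendon and spring elasticity, is an assumption made purely for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def target_motor_command(joint_pos):
    # Hypothetical ground truth: a 3:1 under-actuation ratio plus a small
    # nonlinear term standing in for unmodelled tendon/spring elasticity.
    return joint_pos / 3.0 + 0.05 * np.sin(joint_pos)

# Synthetic training data: joint positions (radians) -> motor commands.
X = rng.uniform(0.0, 1.5, size=(200, 1))
Y = target_motor_command(X)

# One-hidden-layer MLP trained with full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(5000):
    H = np.tanh(X @ W1 + b1)        # hidden activations
    pred = H @ W2 + b2              # predicted motor commands
    err = pred - Y
    # Backpropagation through both layers.
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1.0 - H**2)
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Final training error: the network should closely fit the mapping.
mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2))
```

A trained generator like this replaces an analytical transmission model: the physical robot's motor commands are produced directly from the joint positions used in simulation, sidestepping the hard-to-model tendon dynamics.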

