Dialnet


Abstract of Automatic Extraction of Biometric Descriptors Based on Gait

Rubén Delgado Escaño

  • Nowadays, people identification is a topic of interest due to its implications for safety, service automation, and health control. Historically, people have been identified by their face, iris, or fingerprints. However, those kinds of systems require the collaboration of the subject to be identified, which poses a problem in scenarios where collaboration is impossible.

    Consequently, gait recognition is presented as an alternative for people recognition, since it requires neither the cooperation of the subject nor even their awareness that they are being identified. It can be performed at a distance, and it is difficult to deceive or evade, since a mask, hood, or other typical occluding objects would not fool the recognition system.

    However, the study of gait recognition is not exempt from challenges and open problems. This thesis focuses on studying and resolving points that we believe have not been sufficiently addressed. Firstly, we study the viability of soft-biometric classification in gait recognition, i.e. estimating human characteristics such as age and gender. Secondly, we address the problem of missing data by implementing a cross-dataset model that can jointly use multiple datasets with different subjects, captured with different sensors and characteristics. Thirdly, we implement a framework to create synthetic samples with multiple subjects in the scene. Fourthly, we propose a solution to the missing-modality problem, which arises when one or more of the input modalities are unavailable. Finally, we use knowledge distillation to reduce the computational complexity of a model and its input data, by teaching a model that uses grayscale images to mimic the predictions of a model that uses optical flow.

    In summary, this thesis has explored unconventional aspects of gait-based people identification and proposed new approaches for addressing this challenging problem.
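    The distillation idea in the final contribution can be sketched with a standard soft-target loss in the style of Hinton et al.: the lightweight grayscale "student" is trained to match the temperature-softened class probabilities of the optical-flow "teacher". This is a minimal NumPy illustration of that loss, not the thesis's actual architecture or training code; the logits and temperature value are illustrative assumptions.

    ```python
    import numpy as np

    def softmax(logits, T=1.0):
        """Temperature-softened softmax (numerically stable)."""
        z = np.asarray(logits, dtype=float) / T
        z = z - z.max()
        e = np.exp(z)
        return e / e.sum()

    def distillation_loss(student_logits, teacher_logits, T=4.0):
        """KL(teacher || student) on softened distributions.

        The T**2 factor keeps gradient magnitudes comparable across
        temperatures, as in the classic formulation.
        """
        p = softmax(teacher_logits, T)  # teacher (e.g. optical-flow model)
        q = softmax(student_logits, T)  # student (e.g. grayscale model)
        return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
    ```

    The loss is zero when the student reproduces the teacher's outputs exactly and grows as their softened distributions diverge, so minimizing it pushes the cheaper grayscale model toward the teacher's behaviour without requiring optical flow at inference time.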

