Dialnet
Sparse RKHS estimation via globally convex optimization and its application in LPV-IO identification

  • Authors: Vincent Laurain, Roland Tóth, Dario Piga, Mohamed Abdelmonim Hassan Darwish
  • Published in: Automatica: A Journal of IFAC, the International Federation of Automatic Control, ISSN 0005-1098, No. 115, 2020
  • Language: English
  • Full text not available
  • Abstract
    • Function estimation in the Reproducing Kernel Hilbert Space (RKHS) framework is a powerful tool for the identification of a general class of nonlinear dynamical systems, requiring little a priori information on the model orders and nonlinearities involved. However, the high degrees of freedom (DOFs) of RKHS estimators come at a price: in large-scale function estimation problems, they often require a substantial number of data samples to explore the search space adequately and provide high-performance model estimates. In cases where the nonlinear dynamic relations can be expressed as a sum of functions, the literature addresses this issue by enforcing sparsity to restrict the DOFs of the estimator, resulting in parsimonious model estimates. Unfortunately, all existing solutions are based on greedy approaches, leading to optimization schemes that cannot guarantee convergence to the global optimum. In this paper, we propose an ℓ1-regularized non-parametric RKHS estimator that is the solution of a quadratic optimization problem. The effectiveness of the scheme is demonstrated on the non-parametric identification of LPV-IO models, where the method simultaneously solves (i) the model order selection problem (in terms of the number of input–output lags and the input delay in the model structure) and (ii) the determination of the unknown functional dependency of the model coefficients on the scheduling variable directly from data. The paper also provides an extensive simulation study illustrating the effectiveness of the proposed scheme.
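To make the core idea concrete, the following is a minimal sketch, not the authors' method: an ℓ1-regularized kernel least-squares estimator, whose convexity guarantees global convergence of a simple iterative soft-thresholding (ISTA) scheme. The kernel choice, data, and regularization weight `lam` are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Z, width=1.0):
    # Gram matrix of a Gaussian (RBF) kernel between point sets X and Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width**2))

def sparse_kernel_fit(K, y, lam=0.05, n_iter=2000):
    # Solve the convex problem  min_a 0.5*||y - K a||^2 + lam*||a||_1
    # by ISTA; the l1 term drives many kernel coefficients exactly to zero,
    # restricting the estimator's degrees of freedom.
    L = np.linalg.norm(K, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(K.shape[1])
    for _ in range(n_iter):
        g = K.T @ (K @ a - y)              # gradient of the smooth part
        z = a - g / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return a

# Toy static nonlinearity standing in for an unknown functional dependency
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(60, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(60)
K = gaussian_kernel(X, X)
a = sparse_kernel_fit(K, y)
print("nonzero coefficients:", np.count_nonzero(np.abs(a) > 1e-6), "of", len(a))
```

Because the objective is convex (quadratic loss plus ℓ1 penalty), any such first-order scheme reaches the global optimum, in contrast to the greedy sparsification approaches the abstract criticizes.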

