Please use this identifier to cite or link to this item: https://repositorio.uci.cu/jspui/handle/123456789/9471
Title: Accelerated Proximal Gradient Descent in Metric Learning for Kernel Regression
Author: Gonzalez, Hector
Morell, Carlos
Ferri, Francesc J.
Publication date: 2018
Publisher: Springer
Citation: Gonzalez H., Morell C., Ferri F.J. (2018) Accelerated Proximal Gradient Descent in Metric Learning for Kernel Regression. In: Hernández Heredia Y., Milián Núñez V., Ruiz Shulcloper J. (eds) Progress in Artificial Intelligence and Pattern Recognition. IWAIPR 2018. Lecture Notes in Computer Science, vol 11047. Springer, Cham. https://doi.org/10.1007/978-3-030-01132-1_25
Abstract: The purpose of this paper is to learn a specific distance function for the Nadaraya-Watson estimator so that it can be applied as a non-linear classifier. The approach transforms the predictor variables and learns a kernel function based on a Mahalanobis pseudo-distance with a low-rank structure in the distance function. In the context of metric learning for kernel regression, we introduce an Accelerated Proximal Gradient method to solve the non-convex optimization problem with a better convergence rate than gradient descent. An extensive experiment and the corresponding discussion aim to show that our strategy is a competitive solution relative to previously proposed approaches. Preliminary results suggest that this line of work can accommodate other regularization approaches in order to improve the kernel regression problem.
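The abstract's two ingredients — the Nadaraya-Watson estimator under a learned low-rank Mahalanobis pseudo-distance, and an accelerated proximal gradient loop — can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the Gaussian kernel with unit bandwidth, the FISTA-style momentum schedule, and the generic `prox_g` placeholder are all assumptions, since the abstract does not specify the exact loss or regularizer.

```python
import numpy as np

def nadaraya_watson(A, X, y, x_query):
    """Nadaraya-Watson estimate with a Gaussian kernel over the learned
    pseudo-distance d(x, x') = ||A(x - x')||^2, where A is an (r, d)
    matrix with r <= d, giving the low-rank structure."""
    proj = (X - x_query) @ A.T                  # transformed differences, shape (n, r)
    w = np.exp(-np.sum(proj ** 2, axis=1))      # kernel weights (bandwidth 1, assumed)
    return float(w @ y / w.sum())

def accelerated_proximal_gradient(grad_f, prox_g, A0, step, n_iter=100):
    """Generic FISTA-style accelerated proximal gradient for
    min_A f(A) + g(A): a gradient step on the smooth part f taken at an
    extrapolated point Y, followed by the proximal operator of g."""
    A = A0.copy()
    Y = A0.copy()
    t = 1.0
    for _ in range(n_iter):
        A_next = prox_g(Y - step * grad_f(Y), step)          # forward-backward step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0    # Nesterov momentum schedule
        Y = A_next + ((t - 1.0) / t_next) * (A_next - A)     # extrapolation
        A, t = A_next, t_next
    return A
```

With `prox_g` set to the identity the loop reduces to plain accelerated gradient descent; a soft-thresholding or nuclear-norm prox would be substituted for a sparsity- or rank-promoting regularizer.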
URI: https://repositorio.uci.cu/jspui/handle/123456789/9471
Appears in collections: Eventos

Files in this item:
File      Size       Format      
A025.pdf  116.35 kB  Adobe PDF   View/Open

Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.