We investigate the use of a non-parametric independence measure, the
Hilbert-Schmidt Independence Criterion (HSIC), as a loss function for learning
robust regression and classification models.
This paper proposes a feature selection framework based on HSIC (the Hilbert-Schmidt Independence Criterion), which aims to unify a variety of supervised learning problems, including classification and regression. The method maximises the dependence between features and labels, is solved with a backward stepwise elimination algorithm, and its effectiveness is demonstrated on both artificial and real-world datasets.
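As a sketch of the dependence measure being maximised, the standard biased empirical HSIC estimator can be written in a few lines. The RBF kernel and its bandwidth below are illustrative assumptions, not choices stated in the abstract:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise squared Euclidean distances, then the Gaussian (RBF) kernel matrix.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def hsic(X, Y, gamma=1.0):
    """Biased empirical HSIC: tr(K H L H) / (n - 1)^2,
    where H = I - (1/n) 11^T centres the kernel matrices K and L."""
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    L = rbf_kernel(Y, gamma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2
```

Dependent variables should yield a larger HSIC value than independent ones, which is what a backward elimination scheme can exploit: repeatedly drop the feature whose removal decreases HSIC with the labels the least.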