High-dimensional feature selection by feature-wise kernelized Lasso
Biomedical Sciences

International Journal of Biomedical Data Mining

Author(s): Yamada M, Jitkrittum W, Sigal L, Xing EP, Sugiyama M

Abstract

The goal of supervised feature selection is to find a subset of input features that are responsible for predicting output values. The least absolute shrinkage and selection operator (Lasso) allows computationally efficient feature selection based on linear dependency between input features and output values. In this letter, we consider a feature-wise kernelized Lasso for capturing nonlinear input-output dependency. We first show that, with particular choices of kernel functions, nonredundant features with strong statistical dependence on output values can be found in terms of kernel-based independence measures such as the Hilbert-Schmidt independence criterion (HSIC). We then show that the globally optimal solution can be efficiently computed; this makes the approach scalable to high-dimensional problems. The effectiveness of the proposed method is demonstrated through feature selection experiments for classification and regression with thousands of features. This article was published in Neural Computation and referenced in the International Journal of Biomedical Data Mining.
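The abstract describes a feature-wise kernelized Lasso built on HSIC: each feature and the output are mapped to centered kernel Gram matrices, and a non-negative Lasso is solved over the per-feature Gram matrices so that features with strong (possibly nonlinear) dependence on the output receive large weights. The following is a minimal NumPy sketch of that idea, not the authors' implementation; the Gaussian kernel bandwidth, the regularization parameter `lam`, and the plain coordinate-descent solver are illustrative assumptions (the paper also uses a delta kernel for classification outputs and a dual augmented-Lagrangian solver for scalability).

```python
import numpy as np

def centered_gram(x, sigma=1.0):
    """Gaussian Gram matrix of one variable, double-centered (H K H)."""
    d = (x[:, None] - x[None, :]) ** 2
    K = np.exp(-d / (2.0 * sigma ** 2))
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return H @ K @ H

def hsic_lasso(X, y, lam=0.01, n_iter=200):
    """Sketch of HSIC Lasso: non-negative Lasso over vectorized
    centered Gram matrices of each feature against the output's.
    Returns one non-negative weight per feature; nonzero weights
    mark features with strong statistical dependence on y."""
    n, p = X.shape
    # Each column is the flattened centered Gram matrix of one feature.
    Ks = np.stack([centered_gram(X[:, k]).ravel() for k in range(p)], axis=1)
    L = centered_gram(y).ravel()  # output-side Gram matrix, flattened
    alpha = np.zeros(p)
    col_sq = (Ks ** 2).sum(axis=0)
    # Coordinate descent for min 0.5*||L - Ks @ a||^2 + lam*sum(a), a >= 0.
    for _ in range(n_iter):
        for k in range(p):
            # Residual with feature k's current contribution removed.
            r = L - Ks @ alpha + Ks[:, k] * alpha[k]
            alpha[k] = max(0.0, (Ks[:, k] @ r - lam) / col_sq[k])
    return alpha
```

For example, with an output that depends nonlinearly on one feature (say `y = sin(X[:, 0])`) and the remaining features pure noise, the first feature's weight should dominate, even though its linear correlation with `y` may be near zero, which is exactly the case where a plain Lasso struggles.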
