Properly-Weighted Graph Laplacian for Semi-supervised Learning


Applied Mathematics and Optimization, Volume OnlineFirst – Dec 7, 2019


Publisher: Springer Journals
Copyright: © 2019 Springer Science+Business Media, LLC, part of Springer Nature
Subject: Mathematics; Calculus of Variations and Optimal Control; Optimization; Systems Theory, Control; Theoretical, Mathematical and Computational Physics; Mathematical Methods in Physics; Numerical and Computational Physics, Simulation
ISSN: 0095-4616
eISSN: 1432-0606
DOI: 10.1007/s00245-019-09637-3

Abstract

The performance of traditional graph Laplacian methods for semi-supervised learning degrades substantially as the ratio of labeled to unlabeled data decreases, due to a degeneracy in the graph Laplacian. Several approaches have been proposed recently to address this; however, we show that some of them remain ill-posed in the large-data limit. In this paper, we show a way to correctly set the weights in Laplacian regularization so that the estimator remains well-posed and stable in the large-sample limit. We prove that our semi-supervised learning algorithm converges, in the infinite sample size limit, to the smooth solution of a continuum variational problem that attains the labeled values continuously. Our method is fast and easy to implement.

Keywords: Semi-supervised learning · Label propagation · Asymptotic consistency · PDEs on graphs · Gamma-convergence

Mathematics Subject Classification: 49J55 · 35J20 · 35B65 · 62G20 · 65N12

1 Introduction

For many applications of machine learning, such as medical image classification and speech recognition, labeling data requires human input and is expensive [13], while unlabeled data is relatively cheap. Semi-supervised learning aims to exploit this dichotomy by utilizing the geometric or topological properties of the unlabeled data, in conjunction with the labeled data,
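To make the setting concrete, the sketch below shows the classical Laplacian-regularization estimator the abstract refers to: build a similarity graph on all points, then minimize the Dirichlet energy u^T L u subject to hard constraints on the labeled points. This is a minimal illustration, not the paper's construction; the optional reweighting slot (gamma, alpha) is a hypothetical stand-in marking where weights near labeled points would be modified, and its exact form in the paper differs.

```python
# Minimal sketch of graph-Laplacian semi-supervised learning (harmonic
# label propagation). The gamma/alpha reweighting is a hypothetical
# placeholder for "properly set" weights near labels, not the paper's
# exact formula; alpha = 0 recovers the standard, degenerate estimator.
import numpy as np
from scipy.spatial.distance import cdist

def propagate_labels(X, labeled_idx, labels, sigma=0.5, alpha=0.0):
    """X: (n, d) points; labeled_idx: indices with known labels;
    labels: values g(x_i) at labeled_idx; sigma: kernel bandwidth;
    alpha: exponent of an assumed reweighting near labeled points."""
    n = X.shape[0]
    D2 = cdist(X, X, "sqeuclidean")
    W = np.exp(-D2 / (2 * sigma**2))     # Gaussian similarity graph
    np.fill_diagonal(W, 0.0)

    if alpha > 0:
        # Hypothetical "proper" weights: gamma(x) grows as x approaches
        # the labeled set, so minimizers cannot collapse to a constant
        # with spikes at the labels.
        d_lab = cdist(X, X[labeled_idx]).min(axis=1)
        gamma = (1.0 + 1.0 / np.maximum(d_lab, 1e-3)) ** alpha
        W = W * np.outer(gamma, gamma)

    L = np.diag(W.sum(axis=1)) - W       # unnormalized graph Laplacian
    u = np.zeros(n)
    u[labeled_idx] = labels
    free = np.setdiff1d(np.arange(n), labeled_idx)
    # Minimizing u^T L u with u = g fixed on the labeled set gives the
    # linear system L_ff u_f = -L_fl g for the unlabeled block.
    u[free] = np.linalg.solve(L[np.ix_(free, free)],
                              -L[np.ix_(free, labeled_idx)] @ labels)
    return u
```

With alpha = 0 and only a handful of labels, the solution tends toward a near-constant function with sharp spikes at the labeled points as the number of unlabeled points grows; this is exactly the degeneracy the paper addresses by choosing the weights so that the continuum limit attains the labeled values continuously.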

Journal

Applied Mathematics and Optimization (Springer Journals)

Published: Dec 7, 2019
