Semi-supervised learning


Published on March 4, 2014

Author: ahmdalitaha

Source: slideshare.net

Description

This presentation gives an overview of semi-supervised learning methods (least-squares solution, eigenvectors, and eigenfunctions). It points to some applications where these methods can be used, such as object categorization and interactive image segmentation.

Semi-Supervised Learning
Ahmed Taha, Feb 2014

Content
- Concept introduction
- Graph cut and least-squares solution
- Eigenvectors and eigenfunctions
- Application

Concept Introduction
- Graph cut: divide the graph into two partitions
- Which cut has the lowest cost?

Concept Introduction
- Degree matrix / variation

Concept Introduction
- Object representation: a node can be a 2D point, a 3D point, a pixel, or even a whole image
- In the end, it is ALL nodes and edges

Concept Introduction: Semi-supervised vs. unsupervised learning
- Unsupervised learning: no labeled data

Concept Introduction: Semi-supervised vs. unsupervised learning
- Semi-supervised learning: some labeled data, plus the structure of the unlabeled data

Graph Cut: Least-Squares Solution
- Semi-supervised learning with labeled data
- We now have 3 labeled objects; this should be a fully connected graph

Graph Cut: Least-Squares Solution
- Objective: separate the graph into two parts (red and non-red)
- The size of this matrix is N^2, and it is not sparse (the graph is fully connected)

Graph Cut: Least-Squares Solution
- We can then divide the rest of the graph into blue and non-blue, and so on
- But is this an NP-hard problem? (The graph is fully connected.)

Graph Cut: Least-Squares Solution
- Current situation: we have a fully connected graph, represented by an NxN similarity matrix W
- We want to assign each object a label in {1, -1} (red vs. non-red) with the lowest assignment cost
- But this discrete assignment problem is NP-hard (see the sketch below for how W is built)
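
As a concrete illustration, here is a minimal sketch of how such a similarity matrix W (and the degree matrix D from the earlier slide) might be built. The Gaussian (RBF) kernel and its bandwidth sigma are assumptions; the slides do not specify the similarity function.

```python
import numpy as np

def build_graph(X, sigma=1.0):
    """Build a fully connected similarity graph from data points.

    X     : (n, d) array, one object (node) per row
    sigma : Gaussian kernel bandwidth (an assumed hyperparameter)
    """
    # Pairwise squared Euclidean distances between all objects.
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    # Gaussian similarity: nearby objects get weights close to 1.
    W = np.exp(-sq_dists / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)      # no self-edges
    D = np.diag(W.sum(axis=1))    # degree matrix
    return W, D
```

Note that W is dense (N^2 entries, not sparse), exactly as the slide points out.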

Label Propagation: Least-Squares Solution
- Weighted-average concept: a new node takes the weighted average of its neighbors' labels
- Example: red neighbor (label 1, edge weight 1), blue neighbor (label -1, weight 0.1), green neighbor (label -1, weight 0.2)
- Weighted sum: (1)(1) + (-1)(0.1) + (-1)(0.2) = 1 - 0.1 - 0.2 = 0.7
- Since 0.7 > 0, the new node is probably a red object (label 1)
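
A minimal sketch of this weighted-average step, using the edge weights from the slide's example:

```python
import numpy as np

# Neighbor labels: red = +1, blue = -1, green = -1.
neighbor_labels = np.array([1.0, -1.0, -1.0])
# Edge weights from the new node to each neighbor (from the slide).
weights = np.array([1.0, 0.1, 0.2])

# Weighted sum of neighbor labels: 1*1 - 1*0.1 - 1*0.2 = 0.7.
score = weights @ neighbor_labels
label = 1 if score > 0 else -1   # 0.7 > 0, so probably red (+1)
print(score, label)              # 0.7  1
```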

Label Propagation: Least-Squares Solution
- Here comes the first equation. Let us define:
- Matrix W (NxN): similarity between objects
- Matrix D (NxN): diagonal matrix with the degree of each object
- Matrix L (graph Laplacian) = D - W
- Label vector f (Nx1): the assignment of each object, relaxed to the interval [-1, 1] rather than the discrete set {-1, 1}
- Objective function: min_f (1/2) * sum_ij W_ij (f_i - f_j)^2 = f^T L f

Least-Squares Solution
- Objective function: min_f (1/2) * sum_ij W_ij (f_i - f_j)^2 = f^T L f
- But this does not take the labeled data into account yet
- After some equation manipulation (adding a penalty lam * sum over labeled i of (f_i - y_i)^2 that ties labeled objects to their given labels y_i, then setting the gradient to zero), we get the linear system (L + lam*Lam) f = lam*Lam*y, where Lam is the diagonal indicator of the labeled objects
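
A minimal sketch of this LSQ solve, assuming the quadratic label penalty described above (the slides do not show the exact manipulation, and lam is an assumed hyperparameter):

```python
import numpy as np

def lsq_labels(W, labeled_idx, y_labeled, lam=100.0):
    """Closed-form least-squares label assignment.

    W           : (n, n) similarity matrix
    labeled_idx : indices of the labeled objects
    y_labeled   : their labels in {-1, +1}
    lam         : label-fidelity weight (an assumption)
    """
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W      # Laplacian L = D - W
    Lam = np.zeros((n, n))              # diagonal indicator of labels
    Lam[labeled_idx, labeled_idx] = 1.0
    y = np.zeros(n)
    y[labeled_idx] = y_labeled
    # Minimizing f^T L f + lam * sum_labeled (f_i - y_i)^2 and setting
    # the gradient to zero gives (L + lam*Lam) f = lam*Lam*y.
    f = np.linalg.solve(L + lam * Lam, lam * (Lam @ y))
    return np.sign(f), f                # hard labels and soft scores
```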

Least-Squares Solution
- We still need to solve an NxN system:
- NxN matrix inverse
- NxN matrix multiplication
- We need to reduce the dimension by using eigenvectors of the graph Laplacian

Eigenvectors
- As mentioned before, we want the label vector f
- f = U alpha: once we have U, we can solve for alpha and then recover f
- This is Laplacian eigenmap dimension reduction
- L captures the characteristics of the graph; its eigenvectors map the objects into a new, lower-dimensional space

Eigenvectors
- As mentioned before, we want the label vector f
- Get the eigenvectors U of the Laplacian matrix L
- f = U alpha: once we have U, we can solve for alpha and then recover f (see the sketch below)
- We still have to work with the NxN matrix, if only to compute its eigenvectors
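
A minimal sketch of this reduced solve, reusing the setup of the previous snippet (k, the number of eigenvectors kept, is an assumed parameter):

```python
import numpy as np

def eigenvector_labels(W, labeled_idx, y_labeled, k=10, lam=100.0):
    """Approximate the LSQ solution in the span of the k smallest
    Laplacian eigenvectors: f = U @ alpha."""
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W
    eigvals, eigvecs = np.linalg.eigh(L)    # symmetric eigensolve
    U = eigvecs[:, :k]                      # k smallest eigenvectors
    Sigma = np.diag(eigvals[:k])
    Lam = np.zeros((n, n))
    Lam[labeled_idx, labeled_idx] = 1.0
    y = np.zeros(n)
    y[labeled_idx] = y_labeled
    # Substituting f = U alpha into the LSQ objective turns the
    # n x n system into a small k x k one (U^T L U = Sigma).
    A = Sigma + lam * U.T @ Lam @ U
    b = lam * U.T @ (Lam @ y)
    alpha = np.linalg.solve(A, b)
    f = U @ alpha
    return np.sign(f), f
```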

Eigenfunctions
- Eigenfunctions are the limit of eigenvectors as n → ∞
- For each dimension (here, 2), we approximate the eigenvector by interpolating the eigenfunction obtained from the histogram of that dimension
- This takes far less time than an NxN eigendecomposition
- (More explanation follows)

Eigenfunctions
- Eigenfunctions are the limit of eigenvectors as n → ∞
- Notice: the cost of the eigenfunction solution depends on the number of dimensions, while the cost of the eigenvector solution depends on the number of objects (a sketch follows below)
- Example: image pixels as objects
- Example: images with local features as dimensions
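
A rough, simplified sketch of the per-dimension idea. The actual method (Fergus, Weiss, and Torralba's eigenfunction approach, which this presentation appears to follow) solves a density-weighted generalized eigenproblem; here the bin count, kernel bandwidth, and density weighting are all assumptions.

```python
import numpy as np

def eigenfunction_1d(x, k=5, bins=50):
    """Approximate k Laplacian eigenvectors along ONE data dimension
    from its histogram, then interpolate back to the data points.

    x : (n,) values of all objects in this single dimension
    """
    counts, edges = np.histogram(x, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    sigma = (edges[-1] - edges[0]) / bins        # assumed bandwidth
    # Tiny (bins x bins) affinity between bin centers, weighted by
    # how many objects fall in each bin (the density).
    P = np.diag(counts.astype(float) + 1e-9)
    Wb = np.exp(-(centers[:, None] - centers[None, :]) ** 2
                / (2.0 * sigma ** 2))
    A = P @ Wb @ P
    Lb = np.diag(A.sum(axis=1)) - A              # bins x bins Laplacian
    _, g = np.linalg.eigh(Lb)
    # Interpolate each discrete "eigenfunction" from the bin centers
    # back to all n objects: O(bins^2) work instead of O(n^2).
    return np.stack([np.interp(x, centers, g[:, j]) for j in range(k)],
                    axis=1)
```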

Application
- Object classification
- Interactive image segmentation
- Image segmentation

Application: Object Classification
- COIL-20 dataset
- 20 different objects
- Each object has 72 different poses

Application: Object Classification
- Our experiment: label some of these images, with both positive and negative labels
- Use the LSQ, eigenvector, and eigenfunction methods to compute the labels of the unlabeled data

Application: Object Classification
- Our results (result figures: LSQ solution, eigenvector solution, eigenfunction solution)

Application: Object Classification
- Results analysis:
- The LSQ solution is almost perfect, since it is a near-exact solution
- The eigenvector method generates an approximate solution in less time, which makes sense: it solves just one NxN eigenproblem to get the eigenvectors
- The eigenfunction method also generated an approximate solution, but its running time here was worse

Application: Object Classification
- Timing results analysis (chart)

Application: Object Classification
- Results explanation:
- We have 4 objects x 36 poses per object, 144 objects in total, so the Laplacian matrix has size 144 x 144
- Each image has 128 x 128 grayscale pixels, 16384 in total, so each object has 16384 dimensions
- 144 objects vs. 16384 dimensions

Application: Object Classification
- Results explanation: 144 objects vs. 16384 dimensions
- So the LSQ and eigenvector methods are expected to finish faster, since the matrix L is not that big
- Meanwhile, the eigenfunction method takes a long time, since it must compute an eigenfunction for each of the 16384 dimensions

Application: Interactive Image Segmentation
- Why is it called interactive? The user supplies the labels by marking a few foreground and background pixels

Application: Interactive Image Segmentation
- We now have 500 x 320 = 160000 objects (pixels)
- But each object has only about 5 dimensions (R, G, B, X, Y)
- Eigenvectors vs. eigenfunctions?

Application: Interactive Image Segmentation
- Eigenvectors vs. eigenfunctions?
- There is no way LSQ or the eigenvector method can handle this many objects: the Laplacian matrix would be 160000 x 160000
- The eigenfunction method only computes eigenfunctions for the 5 dimensions, and then we are ready to show some results (see the sketch below)
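
A minimal sketch of turning an image into such per-pixel (R, G, B, X, Y) objects (the normalization to [0, 1] is an assumption):

```python
import numpy as np

def pixel_features(img):
    """Turn an (H, W, 3) RGB image into an (H*W, 5) feature matrix,
    one (R, G, B, X, Y) row per pixel-object."""
    H, W, _ = img.shape
    ys, xs = np.mgrid[0:H, 0:W]
    return np.column_stack([
        img.reshape(-1, 3).astype(float) / 255.0,  # R, G, B in [0, 1]
        xs.reshape(-1) / W,                        # X, normalized
        ys.reshape(-1) / H,                        # Y, normalized
    ])

# A 500 x 320 image becomes a 160000 x 5 matrix; each of the 5 columns
# then gets its own cheap 1-D eigenfunction solve (see earlier sketch),
# instead of one 160000 x 160000 eigenvector problem.
```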

Application: Interactive Image Segmentation (result images)

Application: Interactive Image Segmentation
- Notice how the bears get colors distinct from the rest of the image; this is still work in progress
- It is not perfect yet

Application
- Non-interactive segmentation
- Foreground/background segmentation
- Co-segmentation
- A system that helps the user decide where to add annotations

Conclusion
- It is better to use eigenvectors when you have a small set of objects with high dimensionality
- It is better to use eigenfunctions when you have a large set of objects with low dimensionality

Thanks

