
Learning in Dissimilarity Space
Explore the concept of metric learning in dissimilarity space, focusing on representation, feature representation, Aristotle's categories, dissimilarity space transformation, and learning from observations. Discover how dissimilarity measures can be improved through studying examples.
Presentation Transcript
Metric Learning in Dissimilarity Space
Robert P.W. Duin (1), Manuele Bicego (2), Mauricio Orozco-Alzate (3), Sang-Woon Kim (4), Marco Loog (1)
(1) TU Delft, Netherlands; (2) Univ. of Verona, Italy; (3) National Univ. Colombia, Manizales; (4) Myongji University, Yongin, South Korea
21 August 2014
The PR problem: representation
From sensor to representation to generalization: find a space in which
- objects can be compared,
- new objects can be mapped,
- regions (class regions) can be defined.
[Figure: objects of two classes A and B mapped from the sensor to a representation space.]
Feature representation
Objects become points in a Euclidean space (in the example, the features area and perimeter). We do not know the right features in advance; features reduce the class overlap, but classes still overlap, and the remaining overlap has to be solved by statistics.
[Figure: two overlapping classes A and B in the area-perimeter feature space.]
Aristotle and the categories
Of Aristotle's ten categories (Substance, Quantity, Qualification, Relative, Where, When, Being-in-a-position, Having, Doing, Being-affected), Substance consists of matter (potential) and shape, while the remaining nine are attached attributes. In pattern-recognition terms: matter (potential) corresponds to the class label, shape to dissimilarities, and the attached attributes to features.
Dissimilarity Space
Given a set of objects S, the matrix D of pairwise dissimilarities d_ij, and a representation set R = {r1, r2, r3, ...} chosen from S, every object x is represented by its vector of dissimilarities to R, e.g. x = (d_x1, d_x2, ..., d_x7). These vectors span the dissimilarity space; its axes are the dissimilarities to the representation objects (in the example r1, r2 and r3 correspond to columns 1, 4 and 7 of D). In this way the original dissimilarities on S are turned into a vector space, in which new, transformed dissimilarities D~ can be computed.
[Figure: a 7x7 dissimilarity matrix and the resulting three-dimensional dissimilarity space.]
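The construction can be illustrated in a few lines of code. This is a minimal sketch assuming numpy and scikit-learn; the function and variable names (build_dissimilarity_space, rep_idx) are illustrative, not taken from the presentation.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def build_dissimilarity_space(D, rep_idx):
    """Keep the columns of D that correspond to the representation set R,
    so each row becomes the vector x = (d(x, r1), d(x, r2), ...)."""
    return D[:, rep_idx]

def dissimilarity_space_1nn_error(D_train, y_train, D_test, y_test, rep_idx=None):
    """1-NN error in the dissimilarity space.
    D_train: n_train x n_train dissimilarities between training objects.
    D_test:  n_test x n_train dissimilarities of test objects to training objects."""
    if rep_idx is None:                        # simplest choice: R = full training set
        rep_idx = np.arange(D_train.shape[1])
    X_train = build_dissimilarity_space(D_train, rep_idx)
    X_test = build_dissimilarity_space(D_test, rep_idx)
    clf = KNeighborsClassifier(n_neighbors=1)  # distances between these vectors are
    clf.fit(X_train, y_train)                  # ordinary Euclidean distances
    return 1.0 - clf.score(X_test, y_test)
```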
Learning from observations
knowledge + observations --> more (better) knowledge
dissimilarity measure + training set --> better dissimilarity measure
Can a given dissimilarity measure be improved by studying a set of examples?
Chickenpieces example
446 binary blobs in 5 classes; 44 weighted-edit dissimilarity measures, yielding 44 dissimilarity matrices of size 446x446. For each given matrix, the 1-NN classification error on the given dissimilarities (D) is compared with the classification error in the dissimilarity space (S), estimated by 2-fold cross-validation with 25 repetitions.
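A hedged sketch of this evaluation protocol, assuming numpy and scikit-learn: 25 repetitions of 2-fold cross-validation of the 1-NN error, once directly on the given dissimilarities (metric='precomputed') and once in the dissimilarity space. The function name paired_1nn_errors and the seeding scheme are assumptions for illustration.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.neighbors import KNeighborsClassifier

def paired_1nn_errors(D, y, reps=25, seed=0):
    """Mean 1-NN error on the given dissimilarities and in the dissimilarity
    space, over `reps` repetitions of 2-fold cross-validation."""
    rng = np.random.RandomState(seed)
    err_given, err_space = [], []
    for _ in range(reps):
        folds = StratifiedKFold(n_splits=2, shuffle=True,
                                random_state=rng.randint(1 << 30))
        for tr, te in folds.split(D, y):
            # 1-NN directly on the given dissimilarities
            knn_d = KNeighborsClassifier(n_neighbors=1, metric='precomputed')
            knn_d.fit(D[np.ix_(tr, tr)], y[tr])
            err_given.append(1.0 - knn_d.score(D[np.ix_(te, tr)], y[te]))
            # 1-NN in the dissimilarity space: same rows, now read as feature vectors
            knn_s = KNeighborsClassifier(n_neighbors=1)
            knn_s.fit(D[np.ix_(tr, tr)], y[tr])
            err_space.append(1.0 - knn_s.score(D[np.ix_(te, tr)], y[te]))
    return np.mean(err_given), np.mean(err_space)
```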
Three proposals
Transformations of the given dissimilarities:
- LANN: Locally Adaptive Nearest Neighbor distances*
- NL Scale: Non-Linear Scaling
- ES Lp: Lp distances in the eigenspace of the dissimilarity representation (E D = D~)
* Wang, J., Neskovic, P., Cooper, L.N.: Improving nearest neighbor rule with a simple adaptive distance measure. Pattern Recognition Letters 28(2), 207-213 (2007)
LANN: Locally Adaptive Nearest Neighbor distance (Wang et al., 2007)
Scale the distances to each object by that object's distance to its nearest neighbor of a different class.
[Scatter plots over the 44 Chickenpieces dissimilarity measures: LANN vs. original dissimilarities, and LANN dissimilarity space vs. original dissimilarity space.]
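A minimal sketch of this rescaling, assuming a symmetric training dissimilarity matrix; the function names lann_scale and lann_transform are illustrative, and details may differ from the Wang et al. (2007) formulation.

```python
import numpy as np

def lann_scale(D_train, y_train):
    """For each training object, the distance to its nearest neighbor
    of a different class (its nearest 'enemy')."""
    n = len(y_train)
    r = np.empty(n)
    for i in range(n):
        other = y_train != y_train[i]
        r[i] = D_train[i, other].min()
    return r

def lann_transform(D_query_train, r):
    """Divide the query-to-training distances by the per-object factors r,
    so objects lying far from other classes attract neighbors more easily."""
    return D_query_train / r[np.newaxis, :]
```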
NL Scale: Non-Linear Scaling
Scale the dissimilarities by raising them to some optimized power.
[Scatter plots over the 44 Chickenpieces dissimilarity measures: NL Scale vs. original dissimilarities and dissimilarity spaces; for the Protein data, the cumulative eigenvalue fraction as a function of the number of eigenvectors for powers p = 1 and p = 0.01.]
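A sketch of this non-linear scaling; the candidate power grid and the selection criterion (a cross-validated 1-NN error supplied via eval_fn, e.g. the protocol sketched earlier) are assumptions, not taken from the presentation.

```python
import numpy as np

def nl_scale(D, p):
    """Element-wise power transform of the dissimilarities."""
    return D ** p

def select_power(D, y, eval_fn, candidate_powers=(0.01, 0.1, 0.25, 0.5, 1.0, 2.0)):
    """Pick the power with the lowest error estimate; eval_fn(D_p, y) should
    return an error, e.g. the cross-validated 1-NN error in the dissimilarity space."""
    best_p, best_err = None, np.inf
    for p in candidate_powers:
        err = eval_fn(nl_scale(D, p), y)
        if err < best_err:
            best_p, best_err = p, err
    return best_p
```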
ES Lp: Lp distances in eigenspace
Rotate the dissimilarity space onto its eigenvectors, E D = D~, and compute Lp distances there:
d~(x, x') = || E phi(x) - E phi(x') ||_p
where phi(x) is the vector of dissimilarities of x to the representation set.
[Scatter plots over the 44 Chickenpieces dissimilarity measures: ES Lp vs. original dissimilarities and dissimilarity spaces.]
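A sketch of the ES Lp transformation, under the assumption that the rotation E is obtained from a PCA-style eigendecomposition of the training dissimilarity vectors; the function names and the SVD route are illustrative, not the authors' implementation.

```python
import numpy as np

def eigenspace_rotation(D_train):
    """Eigenvectors of the (centered) training dissimilarity vectors, via SVD;
    the columns of the returned matrix span the rotated space."""
    X = D_train - D_train.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt.T                                   # E: one eigenvector per column

def es_lp_dissimilarities(D_rows_a, D_rows_b, E, p=1.0):
    """Lp distances between dissimilarity vectors after rotation onto E,
    for every pair of rows of D_rows_a and D_rows_b."""
    A = D_rows_a @ E
    B = D_rows_b @ E
    diff = np.abs(A[:, None, :] - B[None, :, :])  # pairwise coordinate differences
    return (diff ** p).sum(axis=2) ** (1.0 / p)
```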
Datasets
[Table of datasets not reproduced in the transcript.]
25 times 2-fold cross-validation error x 1000 (1-NN rule): dissimilarity space vs. given dissimilarities.
[Results table not reproduced in the transcript.]
Comparison with distances
25 times 2-fold cross-validation error x 1000 (1-NN rule): reference dissimilarities vs. dissimilarity space.
[Results table not reproduced in the transcript.]
Comparison with the dissimilarity space
25 times 2-fold cross-validation error x 1000 (1-NN rule): reference dissimilarities vs. dissimilarity space.
[Results table not reproduced in the transcript.]
Conclusions
Transformations:
- LANN yields better dissimilarities.
- NL Scale yields better dissimilarity spaces.
- ES Lp (Lp distances in the eigenvector-rotated dissimilarity space) may be good for large datasets.
Why and when does this work?
Given dissimilarity measures are not always the best ones possible; sufficiently large datasets may be used to learn better ones. The larger the dataset, the more local the learning should be.
[Plots for the PolyDisH57 data: the optimized NL Scale power and the intrinsic dimensionality, and the 1-NN error of NL Scale vs. the original dissimilarities, both as functions of the training set size per class.]