
Classification Techniques and Decision Boundaries
Classification techniques such as nearest-neighbor classifiers, support vector machines, and ensemble methods learn decision boundaries that separate the different classes in a data set. Instance-based classifiers store the training cases and use them directly to predict the class labels of unseen cases; examples include the rote learner and the nearest-neighbor classifier, which uses the k closest stored points to classify a new record.
COSC 4335: Other Classification Techniques
1. Nearest Neighbor Classifiers
2. Support Vector Machines
3. Ensemble Methods and AdaBoost
(Tan, Steinbach, Kumar, Eick: NN-Classifiers and Support Vector Machines)
Classification and Decision Boundaries
Classification can be viewed as learning good decision boundaries that separate the examples belonging to different classes in a data set. (Figure: a two-class data set with the decision boundary drawn between the classes.)
Instance-Based Classifiers
Store the training records and use them to predict the class label of unseen cases. (Figure: a set of stored cases, each with attributes Atr1, ..., AtrN and a class label A/B/C, and an unseen case to be classified.)
Instance-Based Classifiers
Instance-based classifiers do not create a model; they use the training examples directly to classify unseen examples ("lazy" classifiers). Examples:
- Rote learner: memorizes the entire training data and classifies a record only if its attributes match one of the training examples exactly.
- Nearest neighbor: uses the k closest points (nearest neighbors) to perform classification.
1. Nearest-Neighbor Classifiers
A nearest-neighbor classifier requires three things:
- The set of stored records
- A distance metric to compute the distance between records
- The value of k, the number of nearest neighbors to retrieve
To classify an unknown record:
- Compute its distance to the training records
- Identify the k nearest neighbors
- Use the class labels of the nearest neighbors to determine the class label of the unknown record (e.g., by taking a majority vote)
A sketch of this procedure is shown below.
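Below is a minimal sketch of this procedure in Python (not part of the original slides); the function and variable names are hypothetical, and Euclidean distance with a plain majority vote is assumed.

```python
# Minimal k-NN sketch: distances to all stored records, k nearest, majority vote.
import numpy as np
from collections import Counter

def knn_classify(X_train, y_train, x_query, k=3):
    # Compute the Euclidean distance from the query to every stored record
    dists = np.sqrt(((X_train - x_query) ** 2).sum(axis=1))
    # Identify the k nearest neighbors
    nearest = np.argsort(dists)[:k]
    # Majority vote among their class labels
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Usage example with made-up data
X_train = np.array([[1.0, 1.0], [1.2, 0.8], [5.0, 5.2], [4.8, 5.1]])
y_train = np.array(["A", "A", "B", "B"])
print(knn_classify(X_train, y_train, np.array([1.1, 0.9]), k=3))  # -> "A"
```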
Definition of Nearest Neighbor
(Figure: three panels showing (a) the 1-nearest neighbor, (b) the 2-nearest neighbors, and (c) the 3-nearest neighbors of a query point x.)
The k-nearest neighbors of a record x are the data points that have the k smallest distances to x.
Voronoi Diagrams for NN-Classifiers
A Voronoi diagram divides the space into cells, each containing one sample; every location within a cell is closer to that cell's sample than to any other sample. Every query point is therefore assigned the classification of the sample in whose cell it falls. The decision boundary separates the class regions according to the 1-NN decision rule, and knowledge of this boundary is sufficient to classify new points.
Remarks: Voronoi diagrams can be computed in low-dimensional spaces but are infeasible for higher-dimensional spaces. They also represent models for clusters that have been generated by representative-based clustering algorithms. A small sketch of computing a Voronoi diagram follows below.
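As an illustrative sketch (not from the slides), the Voronoi cells of a small 2-D training set can be computed with SciPy; classifying a query by the 1-NN rule is equivalent to looking up the sample whose cell contains the query. The data and variable names below are made up.

```python
# Voronoi cells of the stored samples, plus 1-NN lookup of a query point.
import numpy as np
from scipy.spatial import Voronoi, cKDTree

samples = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.5], [0.5, 2.0]])
labels = np.array(["A", "A", "B", "B"])

vor = Voronoi(samples)      # cell structure underlying the 1-NN classifier (2-D only here)
tree = cKDTree(samples)     # fast nearest-sample lookup

query = np.array([1.8, 0.4])
_, idx = tree.query(query)  # index of the nearest stored sample
print(labels[idx])          # class of the Voronoi cell containing the query -> "B"
```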
Nearest Neighbor Classification
Compute the distance between two points, e.g. the Euclidean distance:
d(p, q) = sqrt( Σ_i (p_i − q_i)² )
Determine the class from the nearest-neighbor list:
- take the majority vote of the class labels among the k nearest neighbors, or
- weigh each vote according to distance, e.g. with weight factor w = 1 / d²
A distance-weighted variant is sketched below.
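The following is a sketch of the distance-weighted variant; the weight w = 1/d² follows the slide, while the function name, the tie-breaking, and the small eps guard against division by zero are illustrative assumptions.

```python
# Distance-weighted k-NN vote: closer neighbors count more (w = 1/d^2).
import numpy as np
from collections import defaultdict

def weighted_knn_classify(X_train, y_train, x_query, k=3, eps=1e-12):
    dists = np.sqrt(((X_train - x_query) ** 2).sum(axis=1))  # Euclidean distances
    nearest = np.argsort(dists)[:k]
    scores = defaultdict(float)
    for i in nearest:
        scores[y_train[i]] += 1.0 / (dists[i] ** 2 + eps)     # weight factor w = 1/d^2
    return max(scores, key=scores.get)                        # class with the largest total weight
```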
Nearest Neighbor Classification
Choosing the value of k:
- If k is too small, the classifier is sensitive to noise points.
- If k is too large, the neighborhood may include points from other classes.
(Figure: a query point x with neighborhoods of different sizes.)
A common way to choose k is cross-validation, as sketched below.
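One common way to pick k is cross-validation; the sketch below uses scikit-learn and the Iris data purely as an illustration and is not part of the original slides.

```python
# Pick k by 5-fold cross-validation on a small benchmark data set.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
for k in (1, 3, 5, 7, 9):
    knn = KNeighborsClassifier(n_neighbors=k)
    acc = cross_val_score(knn, X, y, cv=5).mean()   # mean cross-validated accuracy
    print(f"k={k}: accuracy={acc:.3f}")
```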
Nearest Neighbor Classification
Scaling issues: attributes may have to be scaled to prevent distance measures from being dominated by one of the attributes. Example:
- height of a person may vary from 1.5 m to 1.8 m
- weight of a person may vary from 90 lb to 300 lb
- income of a person may vary from $10K to $1M
A simple rescaling is sketched below.
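A simple remedy is to rescale each attribute, e.g. min-max scaling to [0, 1]; the sketch below is illustrative and the numbers are made up.

```python
# Min-max scaling so that no attribute dominates the Euclidean distance.
import numpy as np

X = np.array([[1.6,  120.0,  30000.0],    # height (m), weight (lb), income ($)
              [1.8,  250.0, 900000.0],
              [1.5,   95.0,  15000.0]])

X_min, X_max = X.min(axis=0), X.max(axis=0)
X_scaled = (X - X_min) / (X_max - X_min)   # every column now spans [0, 1]
print(X_scaled)
```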
Summary: Nearest Neighbor Classifiers
- k-NN classifiers are lazy learners: unlike eager learners such as decision tree induction and rule-based systems, they do not build models explicitly. Consequently, classifying unknown records is relatively expensive.
- k-NN classifiers rely on a distance function; the quality of the distance function is critical for the performance of a k-NN classifier.
- k-NN classifiers obtain high accuracies and are quite popular in some fields, such as text data mining and information retrieval in general.
2. Support Vector Machines
(Figures: a two-class data set with several candidate separating hyperplanes B1, B2, ...; one possible solution, another possible solution, other possible solutions.)
Support Vector Machines
(Figure: two candidate hyperplanes B1 and B2.)
Which one is better, B1 or B2? How do you define "better"?
Support Vector Machines
(Figure: hyperplanes B1 and B2 with their margin boundaries b11, b12 and b21, b22.)
Find the hyperplane that maximizes the margin => B1 is better than B2.
Support Vector Machines
(Figure: hyperplane B1 with margin boundaries b11 and b12.)
Examples are (x1, ..., xn, y) with y ∈ {−1, +1}.
The separating hyperplane is w·x + b = 0; the margin boundaries are w·x + b = +1 and w·x + b = −1.
f(x) = +1 if w·x + b ≥ +1, and f(x) = −1 if w·x + b ≤ −1.
Margin = 2 / ||w||
Support Vector Machines
We want to maximize: Margin = 2 / ||w||
which is equivalent to minimizing: L(w) = ||w||² / 2
subject to the following N constraints: y_i (w·x_i + b) ≥ 1, for i = 1, ..., N.
This is a constrained convex quadratic optimization problem that can be solved in polynomial time. Numerical approaches to solve it (e.g., quadratic programming) exist, and the function to be optimized has only a single minimum, so there is no local-minimum problem. A small sketch of fitting such a maximum-margin classifier follows below.
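As an illustration (not part of the slides), a linear SVM fitted with scikit-learn on a tiny separable data set exposes w, so the margin 2/||w|| can be read off directly; using a very large C to approximate the hard-margin formulation is an assumption of this sketch.

```python
# Fit a (nearly) hard-margin linear SVM and report the resulting margin.
import numpy as np
from sklearn.svm import SVC

X = np.array([[1.0, 1.0], [1.5, 0.5], [4.0, 4.0], [4.5, 3.5]])
y = np.array([-1, -1, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)   # large C ~ hard margin on separable data
w = clf.coef_[0]
print("margin =", 2.0 / np.linalg.norm(w))    # Margin = 2 / ||w||
print("support vectors:", clf.support_vectors_)
```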
Support Vector Machines
What if the problem is not linearly separable?
Linear SVM for Non-linearly Separable Problems
What if the problem is not linearly separable? Introduce slack variables ξ_i.
Need to minimize: L(w) = ||w||² / 2 + C · (Σ_{i=1..N} ξ_i)^k
(the first term is the inverse of the margin size between the hyperplanes; the second term penalizes the training error via the slack variables)
Subject to (i = 1, ..., N):
(1) y_i (w·x_i + b) ≥ 1 − ξ_i
(2) ξ_i ≥ 0
The slack variable ξ_i allows constraint violation to a certain degree. The parameter C is chosen using a validation set, trying to keep the margins wide while keeping the training error low; a sketch of this choice follows below.
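A sketch of this model-selection step, assuming scikit-learn's LinearSVC, a synthetic data set, and a simple hold-out validation split (all illustrative choices, not part of the slides):

```python
# Choose the soft-margin parameter C on a validation set.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=300, n_features=5, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

best_C, best_acc = None, -1.0
for C in (0.01, 0.1, 1.0, 10.0, 100.0):
    clf = LinearSVC(C=C, max_iter=10000).fit(X_tr, y_tr)
    acc = clf.score(X_val, y_val)              # validation accuracy for this C
    if acc > best_acc:
        best_C, best_acc = C, acc
print("chosen C:", best_C, "validation accuracy:", round(best_acc, 3))
```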
Nonlinear Support Vector Machines
What if the decision boundary is not linear?
Alternative 1: Use a technique that employs non-linear decision boundaries directly. (Figure: a non-linear function separating the two classes.)
Nonlinear Support Vector Machines
Alternative 2: Transform the data into a higher-dimensional attribute space and find linear decision boundaries in this space:
1. Transform the data into the higher-dimensional space.
2. Find the best hyperplane using the methods introduced earlier.
Nonlinear Support Vector Machines
1. Choose a non-linear kernel function Φ to transform the data into a different, usually higher-dimensional, attribute space.
2. Minimize L(w) = ||w||² / 2, i.e. find a good hyperplane in the transformed space, subject to the following N constraints: y_i (w·Φ(x_i) + b) ≥ 1, for i = 1, ..., N.
A sketch of this approach, using an explicit feature map, follows below.
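The sketch below illustrates this idea with an explicit degree-2 polynomial feature map on XOR-like data, which is not linearly separable in the original space but becomes separable after the transformation; the use of scikit-learn and the particular data set are illustrative assumptions (in practice the kernel trick avoids the explicit map, see the next slide).

```python
# Explicitly transform the data, then fit a linear SVM in the transformed space.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.svm import LinearSVC

# XOR-like data: not linearly separable in the original 2-D space
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

phi = PolynomialFeatures(degree=2, include_bias=True)   # explicit feature map
X_mapped = phi.fit_transform(X)                         # higher-dimensional attribute space

clf = LinearSVC(C=1000, max_iter=10000).fit(X_mapped, y)
print(clf.predict(phi.transform([[0.0, 1.0], [1.0, 1.0]])))  # expected: [1 0]
```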
Example: Polynomial Kernel Function
Polynomial kernel function: Φ(x1, x2) = (x1², x2², √2·x1·x2, √2·x1, √2·x2, 1)
K(u, v) = Φ(u)·Φ(v) = (u·v + 1)²
Kernel function trick: perform the computations in the original space, although we solve an optimization problem in the transformed space; this is much more efficient!
A support vector machine with a polynomial kernel function classifies a new example z as follows:
sign( Σ_i λ_i y_i Φ(x_i)·Φ(z) + b ) = sign( Σ_i λ_i y_i (x_i·z + 1)² + b )
Remark: the λ_i and b are determined using the methods for linear SVMs that were discussed earlier. The kernel identity can be checked numerically, as sketched below.
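The kernel identity K(u, v) = Φ(u)·Φ(v) can be verified numerically; the small script below is an illustrative check (not from the slides) for the 2-D degree-2 polynomial kernel.

```python
# Check that the explicit map and the kernel give the same inner products.
import numpy as np

def phi(x):
    x1, x2 = x
    # Explicit feature map for the degree-2 polynomial kernel in 2-D
    return np.array([x1**2, x2**2, np.sqrt(2)*x1*x2,
                     np.sqrt(2)*x1, np.sqrt(2)*x2, 1.0])

def poly_kernel(u, v):
    return (np.dot(u, v) + 1.0) ** 2       # K(u, v) = (u·v + 1)^2

u, v = np.array([1.0, 2.0]), np.array([3.0, -1.0])
print(np.dot(phi(u), phi(v)))   # inner product in the transformed space -> 4.0
print(poly_kernel(u, v))        # same value computed in the original space -> 4.0
```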
Summary: Support Vector Machines
- Support vector machines learn hyperplanes that separate two classes while maximizing the margin between them (the empty space between the instances of the two classes).
- Support vector machines introduce slack variables in case the classes are not linearly separable, trying to maximize the margins while keeping the training error low.
- The most popular versions of SVMs use non-linear kernel functions to map the attribute space into a higher-dimensional space, which facilitates finding good linear decision boundaries in the modified space.
- Support vector machines find margin-optimal hyperplanes by solving a convex quadratic optimization problem. However, this optimization process is quite slow, and support vector machines tend to fail if the number of examples goes beyond 500/2000/5000.
- In general, support vector machines attain quite high accuracies compared to other techniques.
Useful Support Vector Machine Links
Lecture notes that are helpful for understanding the basic ideas:
- http://www.ics.uci.edu/~welling/teaching/KernelsICS273B/Kernels.html
- http://cerium.raunvis.hi.is/~tpr/courseware/svm/kraekjur.html
Tools that are often used in publications:
- libsvm: http://www.csie.ntu.edu.tw/~cjlin/libsvm/
- spider: http://www.kyb.tuebingen.mpg.de/bs/people/spider/index.html
Tutorial slides: http://www.support-vector.net/icml-tutorial.pdf
Surveys: http://www.svms.org/survey/Camp00.pdf
More general material:
- http://www.learning-with-kernels.org/
- http://www.kernel-machines.org/
- http://kernel-machines.org/publications.html
- http://www.support-vector.net/tutorial.html
Remark: Thanks to Chaofan Sun for providing these links!