K-Nearest Neighbor Classifiers (KNNC)


This presentation covers the concept, flowchart, decision boundary, demos, and characteristics of KNNC, a classification algorithm that assigns a class to a point by voting among its nearest neighbors. It also discusses feature extraction, model construction and evaluation, preprocessing techniques such as normalization, natural examples of Voronoi diagrams, and the strengths and weaknesses of KNNC.




Presentation Transcript


  1. K-Nearest Neighbor Classifiers (KNNC). J.-S. Roger Jang (jang@mirlab.org), http://mirlab.org/jang, MIR Lab, CSIE Dept., National Taiwan University. 2025/4/3

  2. Concept of KNNC. KNNC works in two steps: (1) find the k nearest neighbors of a given point; (2) determine the class of the given point by voting among those k nearest neighbors. Figure: in a 2-D feature space containing class-A and class-B points, the 3 nearest neighbors of a query point are found, and the point is classified as B via 3NNC. (Quiz!)
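The two steps above can be sketched in a few lines of Python (a minimal illustration, not code from the slides; the function name and data layout are my own):

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (feature_vector, label) pairs. This is a
    hypothetical helper for illustration, not the lecture's code.
    """
    # Step 1: find the k nearest neighbors by Euclidean distance.
    neighbors = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    # Step 2: vote among the neighbors' labels.
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]
```

A query near the class-A cluster is labeled A; one near the class-B cluster is labeled B.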

  3. Flowchart for KNNC. The classification flowchart: feature extraction (from raw data to features), then model construction (clustering is optional for KNNC), then model evaluation (KNNC is evaluated on a test dataset).

  4. Decision Boundary for 1NNC. The decision boundary of 1NNC forms a Voronoi diagram, i.e., a piecewise-linear boundary. (Quiz! See also: more about Voronoi diagrams.)

  5. Demos by Cleve Moler. Cleve's demos of Delaunay triangulation and Voronoi diagrams: books/dcpr/example/cleve/vshow.m

  6. Natural Examples of Voronoi Diagrams (1/2). (Figure: saltflat.jpg)

  7. Natural Examples of Voronoi Diagrams (2/2)

  8. Characteristics of KNNC. Strengths: intuitive, and no computation is required for model construction. Weaknesses: massive computation is required at classification time when the dataset is big, and there is no straightforward way to determine the value of k or to rescale the dataset along each dimension. (Quiz!)

  9. Preprocessing via Feature Normalization. Z-normalization (z-score): transform each feature to have zero mean and unit variance. Range normalization: transform each feature to a specific range, such as [0, 1]. Let X = {x_1, x_2, ..., x_n} be the values of a specific feature of a dataset. Z-normalization: x_i' = (x_i - μ) / σ, with μ and σ² being the sample mean and sample variance of X, respectively. Range normalization: x_i' = (x_i - min X) / (max X - min X), which yields a range of [0, 1]. (Quiz!)
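Both normalizations can be sketched as follows (an illustrative snippet, not from the slides; the function names are my own, and the z-score here uses the sample standard deviation):

```python
import statistics

def z_normalize(xs):
    """Rescale values to zero mean and unit variance (z-scores)."""
    mu = statistics.mean(xs)
    sigma = statistics.stdev(xs)  # sample standard deviation
    return [(x - mu) / sigma for x in xs]

def range_normalize(xs):
    """Rescale values linearly into the range [0, 1]."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]
```

Applied per feature (per column), these put all dimensions on a comparable scale before distance computation, which addresses the rescaling weakness noted above.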

  10. Variants of KNNC. Many variants exist: nearest-prototype classification, with a single prototype per class (use the class mean or average) or several prototypes per class (use k-means clustering); distance-weighted votes; edited nearest-neighbor classification; and the k+k-nearest-neighbor rule. (Quiz!)
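The single-prototype-per-class variant (one prototype per class via the class mean) can be sketched as follows (a minimal illustration with hypothetical helper names, not the lecture's code):

```python
import math
from collections import defaultdict

def class_means(train):
    """One prototype per class: the componentwise mean of that class's points.

    `train` is a list of (feature_vector, label) pairs.
    """
    groups = defaultdict(list)
    for x, label in train:
        groups[label].append(x)
    return {label: tuple(sum(vals) / len(pts) for vals in zip(*pts))
            for label, pts in groups.items()}

def nearest_prototype(prototypes, query):
    """Assign `query` the label of the closest prototype (Euclidean distance)."""
    return min(prototypes, key=lambda label: math.dist(prototypes[label], query))
```

Classification then needs only one distance computation per class instead of one per training point, which is the main appeal of prototype-based variants for large datasets.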

  11. 1NNC Decision Boundaries (figure)

  12. 1NNC Distance/Posterior as Surfaces and Contours (figures)

  13. Using Prototypes in KNNC. In this example, the number of prototypes for each class is 4.

  14. Decision Boundaries of Different Classifiers (2025/4/3). Figures: decision boundaries of the naive Bayes classifier, the quadratic classifier, and the 1NNC classifier on the iris dataset (sepal length vs. sepal width), each panel showing the dataset with its decision boundary.

  15. Exercise: KNNC Decision Boundary. Given 6 samples of two classes as shown below, plot the decision boundary based on KNNC with k = 1.

  16. Exercise: Nearest Prototype Classifier. If we want to use the nearest prototype classifier instead of KNNC, what methods can be used to find the prototype(s) for each class in the following two cases: when the number of prototypes is 1 for each class, and when the number of prototypes is more than 1 for each class?
