Clustering Algorithms


Using clustering algorithms to discover structure in data points, without prior knowledge of that structure. Explores k-Means and other clustering techniques that group data points by similarity. Illustrations and explanations included.


Uploaded on Apr 03, 2025



Presentation Transcript


  1. Week 7, Video 1: Clustering

  2. Clustering: a type of Structure Discovery algorithm. This type of method is also sometimes referred to as Dimensionality Reduction, based on a common application.

  3. Clustering: You have a large number of data points, and you want to find what structure there is among them. You don't know anything a priori about that structure. Clustering tries to find data points that group together.

  4. Trivial Example: Let's say your data has two variables: the probability that the student knows the skill, from BKT (Pknow), and unitized time. Note: clustering also works for (and is effective in) large feature spaces.

  5. [Figure: scatter plot of the data; time (-3 to +3) vs. pknow (0 to 1)]

  6. k-Means Clustering Algorithm [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  7. k-Means is not the only clustering algorithm, just the simplest. We'll discuss fancier ones as the week goes on.

  8. How did we get these clusters? First, we decided how many clusters we wanted: 5. How did we decide that? More on this in the next lecture. Then we picked starting values for the centroids of the clusters, usually chosen randomly. Sometimes there are good reasons to start with specific initial values.
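
The initialization step just described can be sketched in a few lines. This is a minimal sketch, not code from the lecture; `init_centroids` is an illustrative name, and the toy data assumes two columns, (pknow, time):

```python
import numpy as np

def init_centroids(points, k, seed=None):
    """Randomly pick k distinct data points as the starting centroids."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(points), size=k, replace=False)
    return points[idx].copy()

# Toy data: columns are (pknow, time)
data = np.array([[0.1, -2.0], [0.9,  2.5], [0.5,  0.0],
                 [0.2, -1.5], [0.8,  2.0], [0.4,  0.3]])
centroids = init_centroids(data, k=3, seed=0)
```

Initializing from actual data points is one common choice; starting from specific, hand-picked values (as the slide notes) just means passing those coordinates in directly instead.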

  9. [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  10. Then we classify every point according to which centroid it's closest to. This defines the clusters, typically visualized as a Voronoi diagram.
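
The assignment step above can be sketched as a nearest-centroid labeling (a minimal sketch, not the lecture's code; Euclidean distance is assumed):

```python
import numpy as np

def assign_clusters(points, centroids):
    """Label each point with the index of its nearest centroid.
    These labels define the clusters (the cells of the Voronoi diagram)."""
    # dists[i, j] = Euclidean distance from point i to centroid j
    dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

points = np.array([[0.0, 0.0], [1.0, 1.0], [0.9, 1.1]])
centroids = np.array([[0.0, 0.0], [1.0, 1.0]])
labels = assign_clusters(points, centroids)  # -> [0, 1, 1]
```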

  11. [Figure: scatter plot with points colored by nearest centroid; time (-3 to +3) vs. pknow (0 to 1)]

  12. Then we re-fit each centroid as the center of the points in its cluster.
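
The re-fit step above is just a per-cluster mean. A minimal sketch; it assumes every cluster currently has at least one point:

```python
import numpy as np

def update_centroids(points, labels, k):
    """Re-fit each centroid as the mean of the points assigned to it.
    Assumes no cluster is empty."""
    return np.array([points[labels == j].mean(axis=0) for j in range(k)])

points = np.array([[0.0, 0.0], [2.0, 0.0], [10.0, 10.0]])
labels = np.array([0, 0, 1])
new_centroids = update_centroids(points, labels, k=2)  # -> [[1, 0], [10, 10]]
```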

  13. [Figure: scatter plot with re-fitted centroids; time (-3 to +3) vs. pknow (0 to 1)]

  14. Then repeat the process until the centroids stop moving: convergence.
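
Putting the steps together gives the whole algorithm: initialize, assign, re-fit, and repeat until the centroids stop moving. A minimal sketch, not the lecture's code; here an empty cluster is simply left where it is:

```python
import numpy as np

def kmeans(points, k, max_iter=100, seed=0):
    """Minimal k-means: random init, then alternate assignment and
    re-fit until the centroids stop moving (convergence)."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)].copy()
    for _ in range(max_iter):
        # Assignment step: each point goes to its nearest centroid
        labels = np.linalg.norm(points[:, None] - centroids[None],
                                axis=2).argmin(axis=1)
        # Re-fit step: centroid moves to the mean of its points;
        # a cluster that won no points keeps its old position
        new = np.array([points[labels == j].mean(axis=0)
                        if np.any(labels == j) else centroids[j]
                        for j in range(k)])
        if np.allclose(new, centroids):  # stopped moving -> converged
            break
        centroids = new
    return centroids, labels

data = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
                 [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
centroids, labels = kmeans(data, k=2)
```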

  15. [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  16. [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  17. [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  18. [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  19. [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  20. Note that there are some outliers. [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  21. What if we start with these points? [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  22. Not very good clusters. [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  23. What happens if your starting points are in strange places? This is not trivial to avoid, considering the full span of possible data distributions.

  24. One solution: run the algorithm several times, with different starting points (cf. Conati & Amershi, 2009).
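
The restart idea on this slide can be sketched as: run k-means from several random starts and keep the run with the lowest within-cluster sum of squared distances (SSE). A minimal sketch with illustrative function names, not the cited paper's method:

```python
import numpy as np

def kmeans_once(points, k, rng, max_iter=100):
    """One k-means run from a random start; returns (sse, centroids, labels)."""
    centroids = points[rng.choice(len(points), size=k, replace=False)].copy()
    for _ in range(max_iter):
        labels = np.linalg.norm(points[:, None] - centroids[None],
                                axis=2).argmin(axis=1)
        new = np.array([points[labels == j].mean(axis=0)
                        if np.any(labels == j) else centroids[j]
                        for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    # Within-cluster sum of squared distances: lower means tighter clusters
    sse = float(((points - centroids[labels]) ** 2).sum())
    return sse, centroids, labels

def kmeans_restarts(points, k, n_restarts=10, seed=0):
    """Run k-means several times with different starting points and
    keep the best (lowest-SSE) solution."""
    rng = np.random.default_rng(seed)
    runs = [kmeans_once(points, k, rng) for _ in range(n_restarts)]
    return min(runs, key=lambda r: r[0])

data = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
                 [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
sse, centroids, labels = kmeans_restarts(data, k=2)
```

Keeping the lowest-SSE run makes a bad random start (like the one in slides 21-22) much less likely to be the final answer.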

  25. Exercises: Take the following examples (the slides will be available in the course materials so you can work through them) and execute k-means on them by hand. Focus on getting the concept rather than the exact right answer. (Solutions were worked by hand rather than with code, and are not guaranteed to be perfect.)

  26. Exercise 7-1-1 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  27. Pause here for the in-video quiz. Work through this yourself if you want to; the quiz's only option is "go ahead."

  28. Solution Step 1 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  29. Solution Step 2 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  30. Solution Step 3 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  31. Solution Step 4 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  32. Solution Step 5 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  33. No points switched -- convergence. [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  34. Notes: k-Means did pretty reasonably here.

  35. Exercise 7-1-2 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  36. Pause here for the in-video quiz. Work through this yourself if you want to; the quiz's only option is "go ahead."

  37. Solution Step 1 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  38. Solution Step 2 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  39. Solution Step 3 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  40. Solution Step 4 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  41. Solution Step 5 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  42. Notes: The three centroids inside the same data lump might move around for a little while, but really, what we have here is one cluster and two outliers. k should be 3 rather than 5; see the next lecture to learn more.

  43. Exercise 7-1-3 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  44. Pause here for the in-video quiz. Work through this yourself if you want to; the quiz's only option is "go ahead."

  45. Solution [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  46. Notes: The bottom-right cluster is actually empty! At no step was that centroid the closest centroid to any data point.
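
The empty-cluster outcome on this slide is easy to reproduce: a centroid that is never the nearest one to any point simply ends up with nothing assigned to it. A minimal illustration on toy data (not the slide's data):

```python
import numpy as np

points = np.array([[0.0, 0.0], [0.1, 0.1], [1.0, 1.0]])
# The third centroid is far from every point, so no point picks it
centroids = np.array([[0.05, 0.05], [1.0, 1.0], [9.0, 9.0]])

labels = np.linalg.norm(points[:, None] - centroids[None],
                        axis=2).argmin(axis=1)          # -> [0, 0, 1]
counts = np.bincount(labels, minlength=len(centroids))  # -> [2, 1, 0]
# counts[2] == 0: cluster 2 never wins a point, so it stays empty
```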

  47. Exercise 7-1-4 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  48. Pause here for the in-video quiz. Work through this yourself if you want to; the quiz's only option is "go ahead."

  49. Solution Step 1 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]

  50. Solution Step 2 [Figure: scatter plot; time (-3 to +3) vs. pknow (0 to 1)]
