
Supervised Learning Basics
Explore the fundamental concepts of supervised learning, including types of models, loss functions, objectives, and optimization techniques. Discover how function approximation ties these concepts together in the realm of machine learning.
Presentation Transcript
Machine Learning Model Overview
New developments (if you are interested in research)
We can design supervised training tasks for unlabeled data:
- Self-supervised learning: generate labels from the data itself, e.g., word2vec, BERT
- GANs: generate fake data, with the trivial real/fake label, from unlabeled data
- Contrastive learning: learn the best vector representation of non-vector data
  - Similar kinds → positive labels
  - Different kinds → negative labels
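The self-supervised idea above can be sketched concretely in the word2vec (skip-gram) style: turn unlabeled text into supervised (center word, context word) training pairs. This is a minimal illustration, not the actual word2vec pipeline; the function name and window size are our own.

```python
# Generate supervised labels from unlabeled text, skip-gram style:
# each (center, context) pair becomes one training example.
def skipgram_pairs(tokens, window=1):
    pairs = []
    for i, center in enumerate(tokens):
        # Look at neighbors within `window` positions of the center word.
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs(["the", "cat", "sat"], window=1)
# Each pair is a labeled example extracted from raw, unlabeled text.
```

A model trained to predict the context word from the center word learns word vectors as a by-product, which is the point of the self-supervised setup.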
Components in supervised learning (from optimization's angle)
- Model: a parameterized function that maps inputs to labels (function approximation)
  - An unknown target function F_w(x), where w are the model parameters
  - An estimated (learned) function F̂_w(x)
- Loss: a measure of how well the model predicts the outcome
  - E.g., in house price prediction: (predicted_price − sold_price)²
- Objective: the goal used to optimize the model
  - E.g., minimize the sum of losses over the training examples
  - E.g., in house price prediction: minimize Σ (predicted_price_i − sold_price_i)², for i = 1..n examples
- Optimization: the algorithm for solving the objective
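The four components can be seen together in a minimal sketch of the house price example: a linear model, the squared-error loss, the sum-of-losses objective, and gradient descent as the optimizer. The data, learning rate, and iteration count below are made up for illustration.

```python
# Model: price = w * size + b   (parameters w, b)
# Loss: (predicted_price - sold_price)^2
# Objective: minimize the sum of losses over all examples
# Optimization: gradient descent

sizes = [1.0, 2.0, 3.0, 4.0]    # house sizes (hypothetical units)
prices = [2.1, 3.9, 6.2, 7.8]   # sold prices (hypothetical units)

w, b = 0.0, 0.0                 # model parameters, initialized at zero
lr = 0.01                       # learning rate (chosen for illustration)

for _ in range(2000):
    grad_w = grad_b = 0.0
    for x, y in zip(sizes, prices):
        err = (w * x + b) - y   # predicted_price - sold_price
        grad_w += 2 * err * x   # d(err^2)/dw, summed over examples
        grad_b += 2 * err       # d(err^2)/db, summed over examples
    w -= lr * grad_w            # step against the gradient
    b -= lr * grad_b
```

After training, (w, b) approaches the least-squares fit of the data, i.e., the parameter setting that minimizes the objective.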
Tricky parts: loss functions
- The loss function is directly tied to the specific ML problem
- Common loss functions:
  - Mean squared error → regression
  - Cross-entropy → logistic/softmax regression and classification
  - Hinge loss → support vector machines
  - Etc. (we will discuss more details later)
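The three losses above can be written out for a single example, which keeps the formulas visible. These are standard textbook definitions; the function names are our own.

```python
import math

def squared_error(y, y_hat):
    """Squared error for regression: y is the true value, y_hat the prediction."""
    return (y_hat - y) ** 2

def cross_entropy(y, p):
    """Binary cross-entropy: y in {0, 1}, p = predicted P(y = 1), 0 < p < 1."""
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def hinge(y, score):
    """Hinge loss for SVMs: y in {-1, +1}, score = raw model output."""
    return max(0.0, 1 - y * score)
```

Note how each loss matches its problem: squared error penalizes distance from a real-valued target, cross-entropy penalizes confident wrong probabilities, and hinge loss is zero once an example is classified with margin at least 1.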
Types of Models
- Linear methods: the decision is made from a linear combination of input features
- Decision trees: use trees to make decisions
- Probabilistic models, e.g., the naïve Bayes classifier: based on Bayes' rule
- Kernel machines, e.g., SVM and kNN: use a kernel function to compute feature similarity
- Neural networks: use a NN to learn the feature representation
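The "feature similarity" idea behind kernel machines and kNN can be sketched with a minimal k-nearest-neighbors classifier: label a new point by the majority vote of the training points most similar to it. We use negative squared Euclidean distance as the similarity for simplicity; a kernel machine would plug in a kernel function here instead. The data and k are made up for illustration.

```python
from collections import Counter

def similarity(a, b):
    """Higher is more similar; a kernel function could be substituted here."""
    return -sum((ai - bi) ** 2 for ai, bi in zip(a, b))

def knn_predict(train_x, train_y, x, k=3):
    # Rank training points by similarity to x, most similar first.
    ranked = sorted(zip(train_x, train_y),
                    key=lambda pair: similarity(pair[0], x),
                    reverse=True)
    # Majority vote among the k most similar neighbors.
    top_labels = [label for _, label in ranked[:k]]
    return Counter(top_labels).most_common(1)[0][0]

train_x = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
train_y = ["a", "a", "a", "b", "b", "b"]
label = knn_predict(train_x, train_y, (0.5, 0.5), k=3)
```

Unlike the linear model earlier, this classifier has no trained parameters: the decision comes entirely from similarity to stored examples.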
Summary
- Learning paradigms: unsupervised, supervised, semi-supervised, and reinforcement learning
- Components of supervised learning: models, loss function, objective, optimization
- Model families: neural networks, decision trees, linear models, naïve Bayes, kernel machines
Question: How does the idea of function approximation link the major concepts in supervised learning (loss, objective, and optimization)?