
General Linear Model Regression Analysis
Learn about the principles and application of the General Linear Model (GLM) in regression analysis. Explore topics such as regressors, the design matrix, covariates, and error analysis via the Ordinary Least Squares method. Understand how to model and analyse data using GLM techniques for effective interpretation of results.
Presentation Transcript
General Linear Model

Y = Xβ + ε

where Y is a J×1 vector of observed data (one value per time point), X is the J×L design matrix (time points × regressors), β is an L×1 vector of parameters (one per regressor), and ε is a J×1 vector of residuals/error. Element-wise: Yj = Σl Xjl·βl + εj.

Observed data = Design Matrix × Parameters + Residuals/Error
Design Matrix: Conditions
Use dummy codes to label the different levels of an experimental factor over time (e.g. task On = 1, rest/Off = 0), giving a column of 0s and 1s that switches between blocks.
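As a sketch, dummy coding for a block design like the one above can be built directly (the condition labels and ordering here are made up for illustration):

```python
import numpy as np

# Hypothetical scan-by-scan condition labels for an alternating block design.
conditions = ["rest", "task", "rest", "task", "rest", "task", "rest"]

# Dummy code: task (On) = 1, rest (Off) = 0.
dummy = np.array([1 if c == "task" else 0 for c in conditions])
print(dummy)  # [0 1 0 1 0 1 0]
```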
Design Matrix: Covariates
Parametric variation of a single variable (e.g. task difficulty rated 1–6) or measured values of a variable (e.g. head movement).
Design Matrix: Constant Variable
A column of 1s models the baseline activity (always = 1).
Design Matrix
The design matrix (time points × regressors) should include everything that might explain the data.
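Putting the pieces together, a small design matrix combining a dummy-coded condition, a covariate, and a constant might be assembled like this (all values are illustrative):

```python
import numpy as np

n = 6  # number of time points (scans)

task = np.array([0, 1, 0, 1, 0, 1])        # dummy-coded condition (Off = 0, On = 1)
difficulty = np.array([3, 5, 1, 6, 2, 4])  # covariate, e.g. task difficulty (1-6)
constant = np.ones(n)                      # models baseline activity (always = 1)

# Stack one column per regressor: time points x regressors.
X = np.column_stack([task, difficulty, constant])
print(X.shape)  # (6, 3)
```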
General Linear Model (recap)

Y = Xβ + ε

Observed data (J×1, time points) = Design Matrix (J×L, time points × regressors) × Parameters (L×1) + Residuals/Error (J×1).
Error
The errors are assumed independent and identically distributed: ε ~ N(0, σ²) i.i.d.
Ordinary Least Squares
The residual sum of squares is the sum of the squared differences between the actual and fitted values:

Σ(t=1..N) et²

OLS chooses the parameter estimates that make this sum a minimum.
Ordinary Least Squares
Y = Xβ + e, so e = Y − Xβ. At the minimum, the residuals are orthogonal to the regressors:

XᵀE = 0
⇒ Xᵀ(Y − Xβ̂) = 0
⇒ XᵀY − XᵀXβ̂ = 0
⇒ XᵀXβ̂ = XᵀY
⇒ β̂ = (XᵀX)⁻¹XᵀY
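The closed-form estimate β̂ = (XᵀX)⁻¹XᵀY can be checked numerically on simulated data (the design, true parameters, and noise level here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: Y = X @ beta_true + noise.
n = 100
X = np.column_stack([rng.normal(size=n), np.ones(n)])  # one regressor + constant
beta_true = np.array([2.0, 5.0])
Y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Normal equations: solve (X'X) beta_hat = X'Y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# Agrees with numpy's built-in least-squares solver.
beta_lstsq, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(beta_hat, np.allclose(beta_hat, beta_lstsq))
```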
fMRI
The same model applies to an fMRI time series: Y = Xβ + ε (observed data = design matrix × parameters + residuals/error).
The Convolution Model
Expected BOLD = Impulses ⊗ HRF: the expected BOLD signal is the stimulus impulse train convolved with the hemodynamic response function (HRF).
Convolve the stimulus function with a canonical hemodynamic response function (HRF): original stimulus function ⊗ HRF = convolved regressor.
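A rough sketch of the convolution step, using a double-gamma HRF shape with illustrative parameters (not SPM's exact canonical HRF; timings and onsets are made up):

```python
import numpy as np
from math import gamma as gamma_fn

def hrf(t, a1=6.0, a2=16.0, ratio=1 / 6):
    """Double-gamma HRF: a gamma-shaped peak minus a smaller, later
    gamma-shaped undershoot. Parameter values are illustrative."""
    peak = t ** (a1 - 1) * np.exp(-t) / gamma_fn(a1)
    undershoot = t ** (a2 - 1) * np.exp(-t) / gamma_fn(a2)
    return peak - ratio * undershoot

tr = 1.0                    # sampling interval in seconds (assumed)
t = np.arange(0, 32, tr)    # HRF support: ~32 s

# Stimulus function: impulses at (hypothetical) stimulus onsets.
stimulus = np.zeros(60)
stimulus[[5, 25, 45]] = 1

# Convolved regressor, truncated to the length of the time series.
expected_bold = np.convolve(stimulus, hrf(t))[:len(stimulus)]
print(expected_bold.round(3)[:12])
```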
Noise
Low-frequency noise. Solution: high-pass filtering.
Discrete cosine transform (DCT) set
blue = data
black = mean + low-frequency drift
green = predicted response, taking into account low-frequency drift
red = predicted response, NOT taking into account low-frequency drift
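One way to build such a DCT drift basis by hand (the 128 s cutoff is a common default, assumed here; including these columns in the design matrix models the low-frequency drift):

```python
import numpy as np

def dct_highpass_basis(n_scans, tr, cutoff=128.0):
    """Discrete cosine basis for drifts slower than `cutoff` seconds.
    Returns an (n_scans x order) matrix of low-frequency cosine regressors."""
    order = int(np.floor(2 * n_scans * tr / cutoff))
    t = np.arange(n_scans)
    cols = [np.cos(np.pi * k * (2 * t + 1) / (2 * n_scans))
            for k in range(1, order + 1)]
    return np.column_stack(cols) if cols else np.zeros((n_scans, 0))

# Example: 100 scans at TR = 2 s gives 3 drift regressors for a 128 s cutoff.
X_drift = dct_highpass_basis(n_scans=100, tr=2.0, cutoff=128.0)
print(X_drift.shape)  # (100, 3)
```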
Assumptions of GLM using OLS
All about error: e ~ N(0, σ²I), i.e. the errors are normal, zero-mean, with equal variance and no correlation.
Unbiasedness
The expected value of the estimate equals the true parameter: E[β̂] = β.
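A quick simulation illustrates unbiasedness: averaging OLS estimates over many independent noise draws recovers the true β (the design and noise are made up):

```python
import numpy as np

rng = np.random.default_rng(1)

n, reps = 50, 2000
X = np.column_stack([np.linspace(-1, 1, n), np.ones(n)])  # regressor + constant
beta = np.array([1.5, 0.5])                               # true parameters

# Fixed OLS projector (X'X)^{-1} X'; only the noise changes per repetition.
projector = np.linalg.inv(X.T @ X) @ X.T

estimates = np.empty((reps, 2))
for i in range(reps):
    Y = X @ beta + rng.normal(size=n)
    estimates[i] = projector @ Y

print(estimates.mean(axis=0))  # close to the true beta
```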
Autoregressive Model
y = Xβ + e, with errors correlated over time: et = a·et−1 + εt.
The autocovariance function is then nonzero; OLS assumes no autocorrelation, i.e. a should = 0.
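A short simulation of AR(1) errors shows the lag-1 autocorrelation that OLS assumes away (a = 0.4 and the series length are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# AR(1) errors: e_t = a * e_{t-1} + innovation.
a, n = 0.4, 5000
innovations = rng.normal(size=n)
e = np.zeros(n)
for t in range(1, n):
    e[t] = a * e[t - 1] + innovations[t]

# The lag-1 autocorrelation is roughly a; OLS assumes it is 0.
lag1 = np.corrcoef(e[:-1], e[1:])[0, 1]
print(round(lag1, 2))
```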
Thanks to Dr. Guillaume Flandin
References
http://www.fil.ion.ucl.ac.uk/spm/doc/books/hbf2/pdfs/Ch7.pdf
http://www.fil.ion.ucl.ac.uk/spm/course/slides10-vancouver/02_General_Linear_Model.pdf
Previous MfD presentations