
Bayesian Regression: Markov Chain Monte Carlo (MCMC) Algorithms and Advantages
Explore the use of Markov Chain Monte Carlo (MCMC) algorithms in Bayesian regression for estimating posterior distributions of parameter values. Learn about the advantages of Bayesian regression, such as incorporating previous knowledge into models and providing a more intuitive interpretation of estimates. Discover the capabilities and issues of MCMC methods, including challenges such as chains getting stuck and failing to converge. Gain insights into linear regression using HMC syntax with the rethinking package.
Introduction to Bayesian Regression, Part #3. Dr Oliver Perra. Full resource: https://www.ncrm.ac.uk/resources/online/all/?id=20843
Markov Chain Monte Carlo (MCMC)
Algorithms to estimate the posterior distribution of parameter value combinations:
- A series of random walks through parameter space;
- Walks follow a set of rules that ensures they tend towards high-probability regions of parameter space;
- Walks (a.k.a. chains) should ensure adequate exploration and representation of the underlying posterior distribution.
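The random-walk idea above can be illustrated with a minimal Metropolis sampler; this is a generic one-parameter sketch with an illustrative Normal(0, 1) target, not code from the presentation:

```r
# Minimal random-walk Metropolis sketch (illustrative target, not from the slides).
set.seed(1)
log_post <- function(theta) dnorm(theta, 0, 1, log = TRUE)  # log posterior density

n_iter <- 5000
chain  <- numeric(n_iter)
theta  <- -3  # deliberately poor starting value
for (i in 1:n_iter) {
  proposal <- theta + rnorm(1, 0, 0.5)  # random step through parameter space
  # Accept with probability min(1, posterior ratio): moves towards
  # high-probability regions are always accepted, other moves sometimes.
  if (log(runif(1)) < log_post(proposal) - log_post(theta)) theta <- proposal
  chain[i] <- theta
}
mean(chain[-(1:1000)])  # discard warm-up; the mean should be near 0
```

The acceptance rule is what makes the chain "tend towards" high-probability regions while still exploring the rest of the distribution.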
Markov Chain Monte Carlo (MCMC)
Issues to consider:
- Chains can get stuck and fail to converge;
- Chains may not be long enough to represent the underlying distribution at high resolution.
An efficient variation is Hamiltonian Monte Carlo (HMC), implemented in R through Stan and the rethinking package.
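Convergence problems like these are usually diagnosed by inspecting the chains after fitting; with the rethinking package this can be sketched as follows (assuming `m` is a fitted HMC model object, which is a placeholder here):

```r
library(rethinking)
# Diagnostics for a fitted HMC model `m` (hypothetical object):
traceplot(m)  # chains should mix well and overlap ("hairy caterpillars")
precis(m)     # effective sample size (n_eff) and Rhat close to 1 suggest convergence
```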
Linear Regression using HMC
Model definition:

y_i ~ Normal(mu_i, sigma)
mu_i = a + b * (x_i - x̄)
a ~ Normal(3300, 600)
b ~ Normal(0, 25)
sigma ~ Uniform(0, 1000)
Linear Regression using HMC: syntax using the rethinking package
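The slide's code is not reproduced in the transcript; a sketch of how the model above might be written with rethinking's `ulam` is given below. The data frame `d` and the variable names `y`, `x` are assumptions, not from the slides:

```r
library(rethinking)
# Hypothetical data frame `d` with outcome y and predictor x
# (names are assumptions); centre the predictor first.
d$x_c <- d$x - mean(d$x)

m <- ulam(
  alist(
    y ~ dnorm(mu, sigma),      # likelihood
    mu <- a + b * x_c,         # linear model on the centred predictor
    a ~ dnorm(3300, 600),      # prior for the intercept
    b ~ dnorm(0, 25),          # prior for the slope
    sigma ~ dunif(0, 1000)     # prior for the residual SD
  ),
  data = d, chains = 4
)
precis(m)  # posterior summaries for a, b, sigma
```

Centring the predictor (x_i - x̄) makes the intercept interpretable as the expected outcome at the average predictor value, matching the Normal(3300, 600) prior.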
Linear Regression using HMC Plot the posterior inference against the data
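Plotting the posterior inference against the data can be sketched with rethinking's helper functions (continuing the hypothetical model `m` and data `d` defined above):

```r
library(rethinking)
# Posterior mean line and 89% interval over a grid of predictor values.
x_seq <- seq(min(d$x_c), max(d$x_c), length.out = 50)
mu <- link(m, data = list(x_c = x_seq))   # posterior samples of mu at each x
mu_mean <- apply(mu, 2, mean)             # posterior mean of mu
mu_PI   <- apply(mu, 2, PI, prob = 0.89)  # 89% interval of mu

plot(y ~ x_c, data = d, col = rangi2)     # raw data
lines(x_seq, mu_mean)                     # posterior mean regression line
shade(mu_PI, x_seq)                       # shaded 89% interval
```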
Advantages of Using Bayesian Regression
- Incorporate previous knowledge in models:
  - Capitalise on previous research;
  - Formalised assumptions, transparently reportable;
- Small samples provide valid estimates;
- More intuitive interpretation of estimates, e.g.:
  - Range of mean values with 89% probability;
  - Range of actual outcome values with 89% probability;
- All combinations of parameters ranked for their plausibility (conditional on data and model).
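The two 89% intervals mentioned above come straight from posterior samples; a sketch, continuing the hypothetical model `m` from earlier:

```r
library(rethinking)
# At a chosen predictor value (here the centred mean, x_c = 0):
mu_at <- link(m, data = list(x_c = 0))  # posterior of the MEAN outcome
y_at  <- sim(m, data = list(x_c = 0))   # posterior predictive of ACTUAL outcomes

PI(mu_at, prob = 0.89)  # range of plausible mean values, 89% probability
PI(y_at,  prob = 0.89)  # range of plausible individual outcomes (wider,
                        # because it also includes residual variation sigma)
```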