Gibbs sampling

In statistics, Gibbs sampling or a Gibbs sampler is a Markov chain Monte Carlo (MCMC) algorithm for obtaining a sequence of observations approximately drawn from a specified multivariate probability distribution when direct sampling from the joint distribution is difficult. This sequence can be used to approximate the joint distribution (e.g., to generate a histogram of the distribution); to approximate the marginal distribution of one of the variables, or of some subset of the variables (for example, the unknown parameters or latent variables); or to compute an integral (such as the expected value of one of the variables). Typically, some of the variables correspond to observations whose values are known and hence do not need to be sampled. Gibbs sampling is commonly used as a means of statistical inference, especially Bayesian inference. It is a randomized algorithm (i.e., an algorithm that makes use of random numbers) and is an alternative to deterministic algorithms for statistical inference such as the expectation–maximization (EM) algorithm. As with other MCMC algorithms, Gibbs sampling generates a Markov chain of samples, each of which is correlated with nearby samples; as a result, care must be taken if independent samples are desired. In particular, samples from the beginning of the chain (the burn-in period) may not accurately represent the desired distribution and are usually discarded.

Depiction
Gibbs sampler picture.jpg
Gibbs sampling info eq.jpg
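The procedure described in the abstract — repeatedly drawing each variable from its full conditional distribution given the current values of the others, then discarding the burn-in period — can be sketched for a simple target. The example below is a minimal illustration, not from the source article: it samples a standard bivariate normal with correlation rho, for which each full conditional is itself a one-dimensional normal. The function name, parameter names, and burn-in default are illustrative assumptions.

```python
import math
import random

def gibbs_bivariate_normal(n_samples, rho, burn_in=500, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is a one-dimensional normal:
        x | y ~ N(rho * y, 1 - rho**2)
        y | x ~ N(rho * x, 1 - rho**2)
    so each sweep alternates exact draws from these two conditionals.
    """
    rng = random.Random(seed)
    cond_sd = math.sqrt(1.0 - rho * rho)  # std. dev. of each conditional
    x, y = 0.0, 0.0                       # arbitrary starting point
    samples = []
    for i in range(burn_in + n_samples):
        x = rng.gauss(rho * y, cond_sd)   # draw x from p(x | y)
        y = rng.gauss(rho * x, cond_sd)   # draw y from p(y | x)
        if i >= burn_in:                  # discard the burn-in period
            samples.append((x, y))
    return samples
```

With enough retained samples, the empirical means approach zero and the sample correlation approaches rho, though — as the abstract notes — successive draws are correlated, so thinning or long runs are needed if approximately independent samples are required.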
Hypernym
Algorithm
Is primary topic of
Gibbs sampling
Label
Gibbs sampling
Link from a Wikipage to an external page
turing.ml/
Link from a Wikipage to another Wikipage
Adaptive rejection sampling
Algorithm
American Mathematical Society
Andrew Gelman
Autocorrelation
Bayes estimator
Bayesian analysis
Bayesian inference
Bayesian learning
Bayesian network
Bayesian statistics
Black magic (programming)
Cartesian product
Categorical distribution
Category:Markov chain Monte Carlo
Chapman & Hall
Church (programming language)
Compound distribution
Conditional distribution
Conjugate prior
Continuous distribution
Curse of dimensionality
Detailed balance equations
Deterministic algorithm
Deterministically
Dirichlet distribution
Dirichlet-multinomial distribution
Discrete distribution
Donald Geman
Elizabeth Wilmer
Equivalence relation
Expectation maximization
Expectation-maximization
Expectation-maximization algorithm
Expected value
Exponential family
File:Gibbs sampler picture.jpg
File:Gibbs sampling info eq.jpg
Forward-backward algorithm
Gamma distribution
Gaussian distribution
Generalized linear model
Generalized linear models
Graphical model
Hidden Markov model
Hierarchical Bayesian model
Independent identically distributed
Integral
Inverse gamma distribution
Joint distribution
Josiah Willard Gibbs
Journal of the American Statistical Association
Julia (programming language)
Just another Gibbs sampler
Latent Dirichlet allocation
Latent variable
Linear regression
Logarithmically concave function
Logistic function
Logistic regression
Marginal distribution
Markov chain
Markov chain Monte Carlo
Markov Chains and Mixing Times
Maximum a posteriori
Maximum entropy classifier
Mean
Metropolis–Hastings
Metropolis–Hastings algorithm
Mixture model
Mode (statistics)
Multivariate distribution
Natural language processing
Negative binomial distribution
Normal distribution
Normalization constant
OpenBUGS
Parameter
Poisson distribution
Posterior distribution
Posterior predictive distribution
Posterior probability
Prior distribution
Probabilistic programming
Probability distribution
Probit regression
PyMC
Python (programming language)
Randomized algorithm
Random number generation
Random variables
Random walk
Sample mean
Sample variance
Sampling (statistics)
Semi-supervised learning
Simulated annealing
Skewness
Slice sampling
Stationary distribution
Statistical inference
Statistical physics
Statistics
Stuart Geman
Student's t-distribution
Supervised learning
Topic model
Unsupervised learning
Variance
Variational Bayes
Yuval Peres
SameAs
Amostragem de Gibbs
Campionamento di Gibbs
Échantillonnage de Gibbs
EZBC
Gibbs sampling
Gibbs-Sampling
m.02jxh7
Muestreo de Gibbs
Q1191905
Семплирование по Гиббсу
نمونه‌برداری گیبز
ギブスサンプリング
吉布斯采样
기브스 표집
Subject
Category:Markov chain Monte Carlo
Thumbnail
Gibbs sampler picture.jpg?width=300
WasDerivedFrom
Gibbs sampling?oldid=1118061637&ns=0
WikiPageLength
37415
Wikipage page ID
509709
Wikipage revision ID
1118061637
WikiPageUsesTemplate
Template:Bayesian statistics
Template:Citation
Template:Cite journal
Template:ISBN
Template:Reflist
Template:Short description