AP Statistics Curriculum 2007 Bayesian Other

Probability and Statistics Ebook - Bayesian Inference for the Binomial and Poisson Distributions

The parameter of interest in this section is the probability P of success in a series of trials, each of which results in either success or failure, with the trials independent of one another and sharing the same probability of success. Suppose there are n such trials and you observe x successes, so that x follows a binomial distribution of index n and parameter P:

x \sim B(n,P)

We can show that

p(x|P) = {n \choose x} P^x (1 - P)^{n - x}, (x = 0, 1, …, n)
p(x|P) is proportional to P^x(1 − P)^{n − x}.
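
As a quick numerical check, the likelihood can be evaluated directly from this formula or with SciPy's binomial pmf. The values below (n = 10, x = 7, P = 0.6) are purely illustrative and not taken from the text; this is only a sketch, not part of the derivation.

from math import comb
from scipy.stats import binom

# Illustrative values (not from the text): n = 10 trials, x = 7 successes, P = 0.6
n, x, P = 10, 7, 0.6

# Likelihood from the formula: C(n, x) * P^x * (1 - P)^(n - x)
manual = comb(n, x) * P**x * (1 - P)**(n - x)

# The same probability via SciPy's binomial pmf
print(manual, binom.pmf(x, n, P))  # both are about 0.215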

If the prior density has the form:

p(P) \propto P^{\alpha - 1} (1-P)^{\beta - 1}, (P between 0 and 1),

then P follows the beta distribution

P \sim \beta(\alpha,\beta).

From this we can obtain the posterior, which has the form:

p(P|x) \propto P^{\alpha + x - 1} (1-P)^{\beta + n - x - 1}.

The posterior distribution of the Binomial is

 (P|x) \sim \beta(\alpha+x,\beta+n-x).
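
A minimal sketch of this conjugate update in Python, using a hypothetical Beta(2, 2) prior and made-up data (14 successes in 20 trials); SciPy is used only to summarize the resulting Beta posterior.

from scipy.stats import beta

# Hypothetical prior parameters and data (not from the text)
prior_alpha, prior_beta = 2.0, 2.0   # Beta(alpha, beta) prior on P
n, x = 20, 14                        # n trials, x observed successes

# Conjugate update: the posterior is Beta(alpha + x, beta + n - x)
posterior = beta(prior_alpha + x, prior_beta + n - x)

print(posterior.mean())              # (alpha + x) / (alpha + beta + n) = 16/24
print(posterior.interval(0.95))      # central 95% credible interval for P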

Bayesian Inference for the Poisson Distribution

A discrete random variable x is said to have a Poisson distribution of mean λ if it has the density:

P(x|\lambda) = {\lambda^x e^{-\lambda}\over x!}

Suppose that you have n observations x=(x_1, x_2, \cdots, x_n) from such a distribution so that the likelihood is:

L(\lambda|x) = \lambda^T e^{-n\lambda}, where T = \sum_{i=1}^{n}x_i.
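
The likelihood depends on the data only through T. As an illustrative check with made-up counts, the product of the individual Poisson probabilities equals this kernel once the data-only constant \prod x_i! is divided out.

import math
from scipy.stats import poisson

# Hypothetical sample and rate (not from the text)
xs = [2, 0, 3, 1, 4]
lam = 2.5
n, T = len(xs), sum(xs)

# Full likelihood: the product of the individual Poisson probabilities
full = math.prod(poisson.pmf(x, lam) for x in xs)

# Kernel lambda^T * exp(-n*lambda), divided by the data-only constant prod(x_i!)
kernel = lam**T * math.exp(-n * lam) / math.prod(math.factorial(x) for x in xs)

print(full, kernel)  # agree up to floating-point error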

In Bayesian inference, the conjugate prior for the parameter λ of the Poisson distribution is the Gamma distribution.

\lambda \sim \Gamma(\alpha, \beta).

The Poisson parameter λ is distributed according to the Gamma density g, parameterized by a shape parameter α and an inverse scale (rate) parameter β:

g(\lambda|\alpha, \beta) = \displaystyle\frac{\beta^\alpha}{\Gamma(\alpha)}\lambda^{\alpha - 1} e^{-\beta \lambda}, for λ > 0.
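
When evaluating this density numerically, note that scipy.stats.gamma is parameterized by a shape and a scale, so the rate β enters as scale = 1/β. A small sketch with assumed values (α = 3, β = 2, λ = 1.5), included only as an illustration:

import math
from scipy.stats import gamma

# Assumed shape and rate (not from the text)
alpha, rate = 3.0, 2.0
lam = 1.5

# Density straight from the formula above
manual = rate**alpha / math.gamma(alpha) * lam**(alpha - 1) * math.exp(-rate * lam)

# SciPy's gamma uses shape and scale, so the rate enters as scale = 1/rate
print(manual, gamma.pdf(lam, a=alpha, scale=1.0 / rate))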

Then, given the sample of n observed values x_i and a Γ(α,β) prior, the posterior distribution becomes:

\lambda \sim \Gamma\left(\alpha + \displaystyle\sum_{i=1}^{n} x_i, \beta + n\right).

The posterior mean E[λ] = (α + T)/(β + n) approaches the maximum likelihood estimate T/n (the sample mean) in the limit as α and β approach 0.
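
A minimal sketch of the Gamma-Poisson update with hypothetical counts, showing the posterior mean (α + T)/(β + n) moving toward the sample mean as α and β shrink toward 0:

# Hypothetical Poisson counts (not from the text)
xs = [2, 0, 3, 1, 4]
n, T = len(xs), sum(xs)

def posterior_mean(alpha, beta):
    # Posterior mean of lambda under a Gamma(alpha, beta) prior: (alpha + T) / (beta + n)
    return (alpha + T) / (beta + n)

print(posterior_mean(3.0, 2.0))      # informative prior pulls the estimate toward alpha/beta = 1.5
print(posterior_mean(0.01, 0.01))    # nearly flat prior: close to the MLE
print(T / n)                         # maximum likelihood estimate, the sample mean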
