# AP Statistics Curriculum 2007 Bayesian Other


## Probability and Statistics Ebook - Bayesian Inference for the Binomial and Poisson Distributions

The parameter of interest in this section is the probability P of success in a sequence of trials, each of which results in either success or failure; the trials are independent of one another and share the same probability of success. Suppose there are n trials and you observe x successes, so that x follows a binomial distribution of index n and parameter P:

$x \sim B(n,P)$

We can show that

$p(x|P) = {n \choose x} P^x (1 - P)^{n - x}$, for x = 0, 1, …, n,

so that $p(x|P) \propto P^x (1 - P)^{n - x}$.
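As a quick numerical check, the binomial pmf can be evaluated directly with Python's standard library; the values n = 10 and P = 0.3 below are arbitrary illustrations:

```python
from math import comb

def binomial_pmf(x, n, p):
    """p(x|P) = C(n, x) * P^x * (1 - P)^(n - x): probability of x successes in n trials."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# With arbitrary n = 10, P = 0.3, the pmf sums to 1 over x = 0, 1, ..., n.
total = sum(binomial_pmf(x, 10, 0.3) for x in range(11))
```
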

If the prior density has the form:

$p(P) \propto P^{\alpha - 1} (1-P)^{\beta - 1}$, for P between 0 and 1,

then P follows the beta distribution

$P \sim \beta(\alpha,\beta)$.

Multiplying the prior by the likelihood, the posterior has the form:

$p(P|x) \propto P^{\alpha + x - 1} (1-P)^{\beta + n - x - 1}$.

That is, the posterior distribution of P is

$(P|x) \sim \beta(\alpha+x,\beta+n-x)$.
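The conjugacy can be verified numerically: multiplying the Beta prior by the binomial likelihood on a fine grid gives the same posterior mean as the closed-form Beta(α + x, β + n − x). The prior parameters and data below are hypothetical choices for illustration:

```python
# Hypothetical prior Beta(2, 2) and data: x = 7 successes in n = 10 trials.
alpha, beta_, n, x = 2.0, 2.0, 10, 7

# Unnormalized posterior on a midpoint grid:
# prior * likelihood ∝ P^(α+x-1) * (1-P)^(β+n-x-1).
N = 100_000
ps = [(i + 0.5) / N for i in range(N)]
w = [p**(alpha + x - 1) * (1 - p)**(beta_ + n - x - 1) for p in ps]

grid_mean = sum(p * wi for p, wi in zip(ps, w)) / sum(w)
closed_form = (alpha + x) / (alpha + beta_ + n)   # mean of Beta(α+x, β+n−x)
```

The grid mean and the closed-form Beta mean agree to high precision, which is exactly what conjugacy promises: the posterior is another Beta distribution with updated parameters.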

### Bayesian Inference for the Poisson Distribution

A discrete random variable x is said to have a Poisson distribution of mean λ if it has the density:

$P(x|\lambda) = {\lambda^x e^{-\lambda}\over x!}$, for x = 0, 1, 2, …

Suppose that you have n observations $x=(x_1, x_2, \cdots, x_n)$ from such a distribution so that the likelihood is:

$L(\lambda | x) = \lambda^T e^{-n\lambda}$, where $T = \sum_{i=1}^{n}{x_i}$.
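The proportionality can be checked numerically: the product of the individual Poisson pmfs differs from λ^T e^(−nλ) only by the constant factor 1/Π x_i!, so their ratio is the same at every λ. The counts below are a hypothetical sample:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(x|λ) = λ^x e^(−λ) / x!"""
    return lam**x * exp(-lam) / factorial(x)

data = [3, 5, 2, 4, 6]            # hypothetical observed counts
n, T = len(data), sum(data)       # T = Σ x_i

def likelihood(lam):
    prod = 1.0
    for xi in data:
        prod *= poisson_pmf(xi, lam)
    return prod

# The ratio likelihood(λ) / (λ^T e^(−nλ)) does not depend on λ.
r1 = likelihood(2.0) / (2.0**T * exp(-n * 2.0))
r2 = likelihood(5.0) / (5.0**T * exp(-n * 5.0))
```
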

In Bayesian inference, the conjugate prior for the parameter λ of the Poisson distribution is the Gamma distribution.

$\lambda \sim \Gamma(\alpha, \beta)$.

The Poisson parameter λ is distributed according to the Gamma density g, parameterized in terms of a shape parameter α and an inverse scale (rate) parameter β:

$g(\lambda|\alpha, \beta) = \displaystyle\frac{\beta^\alpha}{\Gamma(\alpha)}\lambda^{\alpha - 1} e^{-\beta \lambda}$, for λ > 0.

Then, given the same sample of n observed counts $x_i$ used in the likelihood, and a prior of Γ(α, β), the posterior distribution becomes:

$\lambda | x \sim \Gamma \left(\alpha + \displaystyle\sum_{i=1}^{n} x_i, \beta + n\right)$.

The posterior mean $E[\lambda|x] = \frac{\alpha + \sum_{i=1}^{n} x_i}{\beta + n}$ approaches the maximum likelihood estimate, the sample mean $\bar{x}$, in the limit as α and β approach 0.
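A small numerical sketch (with hypothetical counts and prior parameters) makes the limit concrete: the posterior mean (α + Σ x_i)/(β + n) collapses to the sample mean, the maximum likelihood estimate for λ, as α and β shrink toward 0:

```python
data = [3, 5, 2, 4, 6]           # hypothetical observed Poisson counts
n, T = len(data), sum(data)
mle = T / n                      # maximum likelihood estimate: the sample mean

def posterior_mean(alpha, beta_):
    """Mean of Γ(α + T, β + n) under the rate parameterization."""
    return (alpha + T) / (beta_ + n)

# Shrinking the prior parameters drives the posterior mean toward the MLE.
gaps = [abs(posterior_mean(a, a) - mle) for a in (1.0, 0.1, 0.001)]
```

With α = β = 1 the posterior mean is pulled noticeably toward the prior; by α = β = 0.001 it is within a fraction of a percent of the MLE, illustrating that a vague Gamma prior lets the data dominate.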