Mathematical Statistics, II

 

Assignment 1

I chose these questions for the following reasons. First, they are either things you learned in MATH 321 or things you should be able to figure out from that course. Second, each of these problems can be done in more than one way: one way is always the “brute force” method using basic definitions; the other is a “shortcut.” Thus, these questions are all easy… and hard. Third, they cover topics that you will need in the next couple of chapters of the textbook.

As you solve these, please refer to your notes from MATH 321 and to the textbook for this course. This will help you become familiar with our text and see the connections between the two courses. In addition to handing in your solutions, plan on presenting them. This will give you a better understanding of the material.

Do not use Mathematica on this assignment. Prove to yourself that you can do this by hand. Increasing your confidence in your abilities is a further goal of this assignment.

 

Problem 1: The Gamma Distribution

Recall that the probability density function for the \( GAM(\alpha, \beta) \) distribution can be written as

\[ f(x;\ \alpha,\beta) = \left\{\hspace{1em} \begin{array}{ll} \frac{\displaystyle 1}{\displaystyle \beta^\alpha\ \Gamma[\alpha]}\ x^{\alpha-1}\ e^{-x/\beta} \hspace{2em} & 0 \le x \\[1em] 0 & \text{otherwise} \end{array} \right. \]

Mathematically prove that

  1. the expected value of a \( GAM(\alpha, \beta) \) distribution is \( \alpha\beta \)
  2. the variance of a \( GAM(\alpha, \beta) \) distribution is \( \alpha\beta^2 \)
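
In case they are useful, here are two standard facts about the Gamma function from calculus; the first is its definition and the second is its recursion property, and both show up in the “brute force” route:

\[ \Gamma[\alpha] = \int_0^\infty t^{\alpha - 1}\ e^{-t}\ dt \qquad \text{and} \qquad \Gamma[\alpha + 1] = \alpha\ \Gamma[\alpha], \hspace{2em} \alpha > 0 \]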

 

Problem 2: The Exponential Distribution

Recall that the probability density function for the Exponential distribution with rate \( \lambda > 0 \) can be written as

\[ f(x;\ \lambda) = \left\{\hspace{1em} \begin{array}{ll} \lambda\ e^{-\lambda x} \hspace{2em} & 0 \le x \\[1em] 0 & \text{otherwise} \end{array} \right. \]

Mathematically prove that the Exponential distribution is a special case of the Gamma distribution; that is, show that there are choices of \( \alpha \) and \( \beta \) for which the \( GAM(\alpha, \beta) \) density from Problem 1 reduces to this one.

 

Problem 3: The Distribution of a Product

Let \( X \) be a Bernoulli random variable with success probability \( \pi_x \). Let \( Y \) be a Bernoulli random variable with success probability \( \pi_y \). Finally, let \( X \) and \( Y \) be independent. Write the probability mass function of the product \( XY \).
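
One reminder that may help: because \( X \) and \( Y \) are independent, their joint probabilities factor as

\[ P(X = a,\ Y = b) = P(X = a)\ P(Y = b) \hspace{2em} \text{for all } a \text{ and } b \]

A good first step is to list the values that the product \( XY \) can take.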

 

Problem 4: The Agresti–Coull Estimator

Let \( X \) be a Binomial random variable, \( X \sim Bin(n, \pi) \). Define

\[ T := \frac{X + 1}{n + 2} \]

Mathematically calculate \( \Bbb E[T] \) and \( \Bbb V[T] \).
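
Two facts from MATH 321 that may help, particularly if you take the “shortcut” route, are the mean and variance of the Binomial distribution:

\[ \Bbb E[X] = n\pi \qquad \text{and} \qquad \Bbb V[X] = n\pi\ (1 - \pi) \]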

 

Problem 5: The Irwin–Hall Distribution

Let \( Y \) and \( Z \) be independent standard Uniform random variables. Recall that the probability density function for a standard Uniform random variable is

\[ f(x) = \left\{\hspace{1em} \begin{array}{ll} 1 \hspace{2em} & 0 \le x \le 1 \\[1em] 0 & \text{otherwise} \end{array} \right. \]

Define the random variable \( W = Y + Z \).

  1. Mathematically determine the pdf of \( W \).
  2. Mathematically determine \( \Bbb E[W] \).
  3. Mathematically determine \( \Bbb V[W] \).
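
One route to part 1 (there are others) is the convolution formula for the density of the sum of two independent continuous random variables:

\[ f_W(w) = \int_{-\infty}^{\infty} f_Y(y)\ f_Z(w - y)\ dy \]

If you use it, pay attention to where the integrand is actually nonzero.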

 

Problem 6: Moment-Generating Functions

Calculate the moment-generating function for

  1. the Bernoulli distribution, \( Bern(\pi) \)
  2. the Binomial distribution, \( Bin(n,\ \pi) \)
  3. the standard Uniform distribution \( Unif(0,\ 1) \)
  4. the distribution of \( W \) from Problem 5
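
A reminder that applies to all four parts: the moment-generating function of a random variable \( X \) is defined as

\[ M_X(t) = \Bbb E\left[ e^{tX} \right] \]

Also recall that, when \( Y \) and \( Z \) are independent, \( M_{Y+Z}(t) = M_Y(t)\ M_Z(t) \). That last fact connects part 4 to part 3.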

 

Problem 7: Normalizing Constants

The kernel of a distribution consists of the factors that are functions of the random variable. The probability function is the product of the kernel and a normalizing constant. The normalizing constant does not depend on the random variable and ensures that the probability function integrates (or sums) to 1. Determine the normalizing constant for each of the following probability distributions (a worked example follows the list):

  1. The Poisson distribution
  2. \( \hspace{2em} x\ e^{-x^2/(2\sigma^2)} \), for \( x \geq 0 \)
  3. \( \hspace{2em} ( 1 + x^2) ^{-1} \), for \( x \in (-\infty, \infty) \)
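
As a worked example, consider the Gamma density from Problem 1: its kernel is \( x^{\alpha-1}\ e^{-x/\beta} \), its normalizing constant is \( \frac{1}{\beta^\alpha\ \Gamma[\alpha]} \), and the two are related by

\[ \int_0^\infty x^{\alpha - 1}\ e^{-x/\beta}\ dx = \beta^\alpha\ \Gamma[\alpha] \]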

 

Problem 8: The Beta Distribution

The Beta distribution, BETA\( (\alpha, \beta) \), has probability density function

\[ f(x; \alpha,\beta) = \left\{ \hspace{2em} \begin{array}{ll} \frac{\displaystyle \Gamma[ \alpha + \beta]}{\displaystyle \Gamma[\alpha]\ \Gamma[\beta]}\ x^{\alpha-1}\ (1-x)^{\beta-1} \hspace{2em} & 0 \le x \le 1 \\[1em] 0 & \text{otherwise} \end{array} \right. \]

What distribution is equivalent to a BETA\( (1, 1) \)? Prove it.
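
If needed, recall one further fact about the Gamma function: \( \Gamma[1] = \int_0^\infty e^{-t}\ dt = 1 \).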
