MT426 Notebook 4
prepared by Professor Jenny Baglivo
© Copyright 2009 by Jenny A. Baglivo. All Rights Reserved.

4 MT426 Notebook 4
    4.1 Expected Value of a Random Variable
        4.1.1 Definitions
        4.1.2 Expectations for the Standard Models
    4.2 Expected Value of a Function of a Random Variable
    4.3 Properties of Expectation
    4.4 Variance and Standard Deviation
        4.4.1 Definitions and Properties
        4.4.2 Chebyshev Inequality
        4.4.3 Variances for the Standard Models
    4.5 Expected Value of a Function of a Random Pair
        4.5.1 Definitions and Properties
        4.5.2 Covariance, Correlation, Association
        4.5.3 Correlations for the Standard Models
        4.5.4 Conditional Expectation, Regression
        4.5.5 Historical Note: Regression To The Mean
    4.6 Expected Value of a Linear Function of a Random k-Tuple
        4.6.1 Mean and Variance
        4.6.2 Covariance
        4.6.3 Random Sample, Sample Sum, Sample Mean
        4.6.4 Independent Normal Random Variables
    4.7 Moment Generating Function

Exercise 2. Let X be the discrete random variable with PDF

p(x) = e^(−2.5) (2.5)^x / x!, for x = 0, 1, 2, 3, ...,

and 0 otherwise. Compute E(X).
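
As a numerical check on this sum, here is a minimal Python sketch (my addition) that evaluates E(X) = Σ x p(x) directly, truncating the series once the terms are negligible:

```python
import math

# PMF from Exercise 2: p(x) = e^(-2.5) * (2.5)^x / x!, a Poisson PMF with rate 2.5
p = lambda x: math.exp(-2.5) * 2.5 ** x / math.factorial(x)

# E(X) = sum of x * p(x); the tail beyond x = 50 is negligible for this PMF
print(sum(x * p(x) for x in range(50)))  # ~ 2.5
```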

Exercise 3. Let X be the continuous random variable with PDF

f(x) = 2x/25, for x ∈ [0, 5],

and 0 otherwise. Compute E(X).
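
A similar check works in the continuous case, approximating E(X) = ∫ x f(x) dx with a midpoint Riemann sum (a minimal sketch, my addition):

```python
# f(x) = 2x/25 on [0, 5]; approximate E(X) = integral of x * f(x) dx over [0, 5]
n = 100_000
h = 5.0 / n
mean = sum(((i + 0.5) * h) ** 2 * 2 / 25 for i in range(n)) * h  # x * f(x) = 2x^2/25
print(mean)  # ~ 10/3 = 3.333...
```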

Exercise 4. (a) Let X be the continuous random variable with PDF

f(x) = 8/(2 + x)^3, for x ≥ 0,

and 0 otherwise. Compute E(X).

(b) Let X be the continuous random variable with PDF

f(x) = 2/(2 + x)^2, for x ≥ 0,

and 0 otherwise. Demonstrate that E(X) is indeterminate.
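
Taking f(x) = 8/(2 + x)^3 in part (a) and f(x) = 2/(2 + x)^2 in part (b) (the constants 8 and 2 are forced by the requirement that each density integrate to 1), the failure in part (b) can be seen numerically: the truncated integral ∫ from 0 to M of x f(x) dx keeps growing as M increases. A minimal sketch (my addition):

```python
# Part (b): the integral of x * 2/(2+x)^2 over [0, M] grows without bound with M
def truncated_mean_integral(M, n=200_000):
    h = M / n
    xs = ((i + 0.5) * h for i in range(n))   # midpoint rule
    return sum(x * 2 / (2 + x) ** 2 for x in xs) * h

for M in (10, 100, 1_000, 10_000):
    print(M, round(truncated_mean_integral(M), 3))  # grows roughly like 2*ln(M)
```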

Exercise. Assume that the probability of finding oil in a given drill hole is 0.30, and that the results (finding oil or not) are independent from drill hole to drill hole. An oil company drills one hole at a time. If they find oil, then they stop; otherwise, they continue. However, the company only has enough money to drill five holes. Let X be the number of holes drilled.

(a) Find the expected number of holes drilled, E(X).

(b) The company has borrowed $250,000 to buy equipment, at an interest rate of 12% per drilling period, and has decided to pay back the loan once all drilling has been completed. Thus, g(X) = 250(1.12)^X is the amount (in thousands of dollars) due once all drilling has been completed. Find the expected amount the company will need to repay, E(g(X)).
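
Since X takes only the values 1 through 5, both expectations reduce to five-term sums; a minimal Python sketch (my addition) that enumerates them:

```python
# P(X = k): first oil strike on hole k (k = 1..4); X = 5 if the first four are dry,
# since the fifth hole is drilled whether or not it yields oil
pmf = {k: 0.7 ** (k - 1) * 0.3 for k in range(1, 5)}
pmf[5] = 0.7 ** 4

EX = sum(k * p for k, p in pmf.items())                  # (a) expected holes drilled
EgX = sum(250 * 1.12 ** k * p for k, p in pmf.items())   # (b) expected repayment, $1000s
print(EX, EgX)
```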

Exercise (Gambler’s Ruin). Assume that the probability of winning a game is 0.50, and that the results (win or lose) are independent from game to game. A gambler decides to play the game until (s)he wins. Let X be the number of games played.

(a) Find the expected number of games played, E(X).

(b) The gambler places a $1 bet on the first game and, for each succeeding game, doubles the bet placed on the previous game. Thus, g(X) = 2^(X−1) is the amount (in dollars) bet on the last game played. Demonstrate that the expected amount bet on the last game, E(g(X)), is indeterminate.
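
The divergence is visible in the partial sums: each term of Σ_k 2^(k−1) (1/2)^k equals 1/2, so the series grows without bound. A minimal sketch (my addition):

```python
# E(g(X)) = sum over k of 2^(k-1) * (1/2)^k; every term equals 1/2, so the
# partial sums grow without bound and the expectation is indeterminate
partial = 0.0
for k in range(1, 21):
    partial += 2 ** (k - 1) * 0.5 ** k
    if k % 5 == 0:
        print(k, partial)  # 2.5, 5.0, 7.5, 10.0, ...
```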

4.3 Properties of Expectation

The following properties can be proven using properties of sums and integrals:

  1. Constant Function: If a is a constant, then E(a) = a.
  2. Linear Function: If E(X) can be determined and a and b are constants, then

E(a + bX) = a + bE(X).

  3. Linear Function of Functions: If E(g_i(X)) can be determined for i = 1, 2, ..., k, and a and b_i are constants for i = 1, 2, ..., k, then

E(a + Σ_{i=1}^{k} b_i g_i(X)) = a + Σ_{i=1}^{k} b_i E(g_i(X)).

For example, let X be an exponential random variable with parameter λ = 1. Integration by parts can be used to demonstrate that E(X) = 1, E(X^2) = 2, and E(X^3) = 6. Using these facts, we know that

E(5 + 3X − 4X^2 + X^3) = 5 + 3(1) − 4(2) + 6 = 6.
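
The value can be confirmed numerically by integrating g(x) e^(−x) with a midpoint rule (a minimal sketch, my addition):

```python
import math

# Midpoint-rule approximation of E(g(X)) for the exponential density f(x) = e^(-x);
# the exponential tail beyond x = 60 is negligible
def expect(g, upper=60.0, n=600_000):
    h = upper / n
    return sum(g((i + 0.5) * h) * math.exp(-(i + 0.5) * h) for i in range(n)) * h

g = lambda x: 5 + 3 * x - 4 * x ** 2 + x ** 3
print(expect(g))  # ~ 6.0
```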

4.4 Variance and Standard Deviation

The mean of X is a measure of the center of the distribution. The variance and standard deviation are measures of the spread of the distribution.

4.4.1 Definitions and Properties

Let X be a random variable with mean μ = E(X). Then

  1. Variance: The variance of X is defined as follows:

Var(X) = E((X − μ)^2).

The notation σ^2 = Var(X) is used to denote the variance.

  2. Standard Deviation: The standard deviation of X is defined as follows:

SD(X) = √(Var(X)).

The notation σ = SD(X) is used to denote the standard deviation.

Exercise. Use the properties of expectation to demonstrate the following properties:

  1. Var(X) = E(X^2) − (E(X))^2.
  2. If Y = a + bX for constants a and b, then Var(Y) = b^2 Var(X) and SD(Y) = |b| SD(X).
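
Both identities are easy to spot-check numerically; the sketch below (my addition) uses an arbitrary three-point PMF chosen only for illustration:

```python
# An arbitrary small PMF, used only to spot-check the two identities above
pmf = {1: 0.2, 2: 0.5, 4: 0.3}
E = lambda g: sum(g(x) * p for x, p in pmf.items())

mu = E(lambda x: x)
var_def = E(lambda x: (x - mu) ** 2)          # definition: E((X - mu)^2)
var_alt = E(lambda x: x ** 2) - mu ** 2       # identity 1: E(X^2) - (E(X))^2
print(var_def, var_alt)                       # equal

a, b = 3.0, -2.0                              # Y = a + bX
var_Y = E(lambda x: (a + b * x - (a + b * mu)) ** 2)
print(var_Y, b ** 2 * var_def)                # identity 2: Var(Y) = b^2 Var(X)
```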

4.4.2 Chebyshev Inequality

Chebyshev’s inequality gives us a lower bound for the probability that X is within k standard deviations of its mean, where k is a positive constant.

Theorem (Chebyshev Inequality). Let X be a random variable with mean μ = E(X) and standard deviation σ = SD(X), and let k be a positive constant. Then

P(|X − μ| < kσ) = P(μ − kσ < X < μ + kσ) ≥ 1 − 1/k^2.

(Note: Equivalently, we can say that P(|X − μ| ≥ kσ) ≤ 1/k^2.)

For example, consider the discrete random variable X whose PDF is given in the table below.

x    p(x)        x     p(x)
1    0.02        9     0.
2    0.06        10    0.
3    0.10        11    0.
4    0.12        12    0.
5    0.14        13    0.
6    0.12        14    0.
7    0.10        15    0.
8    0.08        16    0.

For this random variable, μ = E(X) = 6.59 and σ = SD(X) = 3.33.

To illustrate the Chebyshev inequality, let k = 2.

Since

μ − 2σ = 6.59 − 2(3.33) = −0.07, and

μ + 2σ = 6.59 + 2(3.33) = 13.25,

the probability that X is within 2 standard deviations of its mean is

P(μ − 2σ < X < μ + 2σ) = P(−0.07 < X < 13.25) = .

This probability is greater than the guaranteed lower bound of 1 − 1/2^2 = 0.75.
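
The same comparison can be automated for any finite PMF. A minimal sketch (my addition), using a small stand-in distribution of my own rather than the table above:

```python
def chebyshev_check(pmf, k):
    """Compare P(|X - mu| < k*sigma) with the Chebyshev lower bound 1 - 1/k^2."""
    E = lambda g: sum(g(x) * p for x, p in pmf.items())
    mu = E(lambda x: x)
    sigma = E(lambda x: (x - mu) ** 2) ** 0.5
    exact = sum(p for x, p in pmf.items() if abs(x - mu) < k * sigma)
    return exact, 1 - 1 / k ** 2

# stand-in PMF for illustration only
pmf = {1: 0.1, 2: 0.2, 3: 0.4, 4: 0.2, 5: 0.1}
print(chebyshev_check(pmf, 2))  # exact probability is at least 0.75
```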

4.4.3 Variances for the Standard Models

The following table gives the variances for the standard discrete and continuous models:

Probability Model:                                                    Variance:
Discrete Uniform Distribution on {1, 2, ..., n}                       Var(X) = (n^2 − 1)/12
Hypergeometric Distribution with parameters n, M, N                   Var(X) = n (M/N) (1 − M/N) ((N − n)/(N − 1))
Bernoulli Distribution with success probability p                     Var(X) = p(1 − p)
Binomial Distribution based on n trials with success probability p    Var(X) = np(1 − p)
Geometric Distribution with parameter p                               Var(X) = (1 − p)/p^2
Negative Binomial Distribution with parameters r, p                   Var(X) = r(1 − p)/p^2
Poisson Distribution with parameter λ                                 Var(X) = λ
(Continuous) Uniform Distribution on the interval (a, b)              Var(X) = (b − a)^2/12
Exponential Distribution with parameter λ                             Var(X) = 1/λ^2
Gamma Distribution with shape parameter α and scale parameter λ       Var(X) = α/λ^2
Cauchy Distribution with center a and spread b                        Var(X) is indeterminate
Normal Distribution with mean μ and standard deviation σ              Var(X) = σ^2
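
Several of these formulas can be spot-checked by simulation; a minimal sketch (my addition) for the geometric and exponential models:

```python
import random

random.seed(0)
N = 200_000

# Geometric(p): number of Bernoulli(p) trials up to and including the first success
p = 0.3
geom = [next(k for k in range(1, 10_000) if random.random() < p) for _ in range(N)]

# Exponential(lam)
lam = 2.0
expo = [random.expovariate(lam) for _ in range(N)]

def sample_var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

print(sample_var(geom), (1 - p) / p ** 2)  # both ~ 7.78
print(sample_var(expo), 1 / lam ** 2)      # both ~ 0.25
```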

4.5 Expected Value of a Function of a Random Pair

4.5.1 Definitions and Properties

Let g(X, Y) be a real-valued function of the random pair (X, Y).

  1. Discrete Case: If X and Y are discrete random variables with joint range R ⊆ R^2 and joint PDF p(x, y), then the expected value of g(X, Y) is defined as follows:

E(g(X, Y)) = Σ_{(x,y)∈R} g(x, y) p(x, y),

as long as Σ_{(x,y)∈R} |g(x, y)| p(x, y) converges. If the sum does not converge absolutely, then the expected value is said to be indeterminate.

  2. Continuous Case: If X and Y are continuous random variables with joint range R ⊆ R^2 and joint PDF f(x, y), then the expected value of g(X, Y) is defined as follows:

E(g(X, Y)) = ∫∫_R g(x, y) f(x, y) dA,

as long as ∫∫_R |g(x, y)| f(x, y) dA converges. If the integral does not converge absolutely, then the expected value is said to be indeterminate.

Exercise. Let X and Y be the discrete random variables whose joint distribution is given in the table below.

Find E(|X − Y|).

        y = 0   y = 1   y = 2   y = 3   y = 4   Sum:
x = 0   0.10    0.04    0.02    0.01    0.01    0.18
x = 1   0.04    0.10    0.04    0.02    0.01    0.21
x = 2   0.02    0.04    0.10    0.04    0.02    0.22
x = 3   0.01    0.02    0.04    0.10    0.04    0.21
x = 4   0.01    0.01    0.02    0.04    0.10    0.18
Sum:    0.18    0.21    0.22    0.21    0.18    1.00
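
In this example the defining double sum has only 25 terms; a minimal sketch (my addition) that evaluates it directly from the table:

```python
# Joint PMF from the table above: probs[x][y] = p(x, y)
probs = [
    [0.10, 0.04, 0.02, 0.01, 0.01],
    [0.04, 0.10, 0.04, 0.02, 0.01],
    [0.02, 0.04, 0.10, 0.04, 0.02],
    [0.01, 0.02, 0.04, 0.10, 0.04],
    [0.01, 0.01, 0.02, 0.04, 0.10],
]

# E(|X - Y|) = sum over (x, y) of |x - y| * p(x, y)
expected = sum(abs(x - y) * probs[x][y]
               for x in range(5) for y in range(5))
print(expected)  # 0.76
```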

Exercise. A stick of length 5 is laid along a coordinate axis, with its left end at 0 and its right end at 5.

The stick is broken at random in two places. Let X and Y be the locations of the two breaks, and assume that X and Y are independent uniform random variables on the interval (0, 5). Find the expected length of the middle segment.
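
The middle segment runs between the two break points, so its length is |X − Y|; a minimal Monte Carlo sketch (my addition) for its expected value:

```python
import random

random.seed(1)
N = 200_000

# Middle segment of the stick: between the smaller and the larger break point,
# so its length is |X - Y| for independent X, Y ~ Uniform(0, 5)
est = sum(abs(random.uniform(0, 5) - random.uniform(0, 5)) for _ in range(N)) / N
print(est)  # ~ 5/3
```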

First list of properties. Properties of covariance and correlation include the following:

  1. Cov(X, X) = Var(X).
  2. Cov(X, Y) = Cov(Y, X).
  3. Cov(X, Y) = E(XY) − E(X)E(Y).
  4. |Corr(X, Y)| ≤ 1.
  5. |Corr(X, Y)| = 1 if and only if Y = a + bX for some constants a and b with b ≠ 0, except possibly on a set of probability zero.
  6. If X and Y are independent, then Cov(X, Y) = 0 and Corr(X, Y) = 0.

Exercise. Use properties of expectation to demonstrate that Cov(X, Y) = E(XY) − E(X)E(Y).
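
For reference, here is a sketch of the standard argument, writing μX = E(X) and μY = E(Y), starting from the definition Cov(X, Y) = E((X − μX)(Y − μY)), and expanding by linearity:

Cov(X, Y) = E(XY − μY X − μX Y + μX μY)
          = E(XY) − μY E(X) − μX E(Y) + μX μY
          = E(XY) − E(X)E(Y).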

Exercise. Let X and Y be the discrete random variables whose joint distribution is given in the table below.

Find Cov(X, Y) and Corr(X, Y).

        y = 0   y = 1   y = 2   Sum:
x = 0   0.05    0.05    0.10    0.20
x = 1   0.15    0.10    0.07    0.32
x = 2   0.30    0.10    0.08    0.48
Sum:    0.50    0.25    0.25    1.00
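
Once the table is entered, the computation is mechanical; a minimal sketch (my addition) using Cov(X, Y) = E(XY) − E(X)E(Y):

```python
# Joint PMF from the table above: probs[x][y] = p(x, y)
probs = [
    [0.05, 0.05, 0.10],
    [0.15, 0.10, 0.07],
    [0.30, 0.10, 0.08],
]

E = lambda g: sum(g(x, y) * probs[x][y] for x in range(3) for y in range(3))

mu_x, mu_y = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: x * y) - mu_x * mu_y              # Cov = E(XY) - E(X)E(Y)
var_x = E(lambda x, y: x * x) - mu_x ** 2
var_y = E(lambda x, y: y * y) - mu_y ** 2
corr = cov / (var_x * var_y) ** 0.5                    # Corr = Cov / (SD(X) SD(Y))
print(cov, corr)
```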