# Problems on Transform Methods

Module by: Paul E. Pfeiffer, from the collection *Applied Probability*.

## Exercise 1

Calculate directly the generating function $g_X(s)$ for the geometric ($p$) distribution.

### Solution

$$g_X(s) = E[s^X] = \sum_{k=0}^{\infty} p_k s^k = p \sum_{k=0}^{\infty} q^k s^k = \frac{p}{1 - qs} \quad \text{(geometric series)} \tag{1}$$
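As a quick numerical sanity check (a Python sketch, not part of the original MATLAB toolbox workflow; the values of $p$ and $s$ are illustrative choices, not from the exercise), the closed form can be compared against a truncated series sum:

```python
# Sanity check: compare p/(1 - q s) with a truncated sum of E[s^X]
# for the geometric (p) distribution, p_k = p q^k, k = 0, 1, 2, ...
p, s = 0.3, 0.5          # illustrative values, not from the exercise
q = 1 - p

series = sum(p * q**k * s**k for k in range(200))   # truncated E[s^X]
closed = p / (1 - q * s)

assert abs(series - closed) < 1e-12
```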

## Exercise 2

Calculate directly the generating function $g_X(s)$ for the Poisson ($\mu$) distribution.

### Solution

$$g_X(s) = E[s^X] = \sum_{k=0}^{\infty} p_k s^k = e^{-\mu} \sum_{k=0}^{\infty} \frac{\mu^k s^k}{k!} = e^{-\mu} e^{\mu s} = e^{\mu(s-1)} \tag{2}$$

## Exercise 3

A projection bulb has life (in hours) represented by $X \sim$ exponential (1/50). The unit will be replaced immediately upon failure or at 60 hours, whichever comes first. Determine the moment generating function for the time $Y$ to replacement.

### Solution

With $a = 60$ and $\lambda = 1/50$,

$$Y = I_{[0,a]}(X)\,X + I_{(a,\infty)}(X)\,a \qquad e^{sY} = I_{[0,a]}(X)\,e^{sX} + I_{(a,\infty)}(X)\,e^{as} \tag{3}$$

$$M_Y(s) = \int_0^a e^{st}\,\lambda e^{-\lambda t}\,dt + e^{sa} \int_a^{\infty} \lambda e^{-\lambda t}\,dt \tag{4}$$

$$= \frac{\lambda}{\lambda - s}\left[1 - e^{-(\lambda - s)a}\right] + e^{-(\lambda - s)a} \tag{5}$$
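The closed form in (5) can be verified by direct numerical integration (a Python sketch outside the MATLAB toolbox; the value of $s$ is an arbitrary choice with $s < \lambda$):

```python
import math

# Check M_Y(s) for a = 60, lambda = 1/50 by direct numerical integration.
lam, a, s = 1/50, 60.0, 0.005   # s chosen arbitrarily, with s < lambda

# closed form from the solution above
closed = lam/(lam - s) * (1 - math.exp(-(lam - s)*a)) + math.exp(-(lam - s)*a)

# midpoint-rule evaluation of the integral part of E[e^{sY}]
n = 20000
h = a / n
integral = sum(math.exp(s*(i + 0.5)*h) * lam * math.exp(-lam*(i + 0.5)*h) * h
               for i in range(n))
direct = integral + math.exp(s*a) * math.exp(-lam*a)   # plus the atom at Y = a

assert abs(direct - closed) < 1e-6
```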

## Exercise 4

Simple random variable X has distribution

$$X = \begin{bmatrix} -3 & -2 & 0 & 1 & 4 \end{bmatrix} \qquad PX = \begin{bmatrix} 0.15 & 0.20 & 0.30 & 0.25 & 0.10 \end{bmatrix} \tag{6}$$
1. Determine the moment generating function for $X$.
2. Show by direct calculation that $M_X'(0) = E[X]$ and $M_X''(0) = E[X^2]$.

### Solution

$$M_X(s) = 0.15e^{-3s} + 0.20e^{-2s} + 0.30 + 0.25e^{s} + 0.10e^{4s} \tag{7}$$

$$M_X'(s) = -3(0.15)e^{-3s} - 2(0.20)e^{-2s} + 0 + 0.25e^{s} + 4(0.10)e^{4s} \tag{8}$$

$$M_X''(s) = (-3)^2(0.15)e^{-3s} + (-2)^2(0.20)e^{-2s} + 0 + 0.25e^{s} + 4^2(0.10)e^{4s} \tag{9}$$

Setting $s = 0$ and using $e^0 = 1$ gives the desired results.
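A quick numerical illustration of part (b) (a Python sketch; central differences stand in for the symbolic derivatives):

```python
import math

X  = [-3, -2, 0, 1, 4]
PX = [0.15, 0.20, 0.30, 0.25, 0.10]

def M(s):
    # moment generating function E[e^{sX}] for the simple distribution
    return sum(p * math.exp(s * x) for x, p in zip(X, PX))

h = 1e-5
M1 = (M(h) - M(-h)) / (2*h)            # approximates M'(0)
M2 = (M(h) - 2*M(0) + M(-h)) / h**2    # approximates M''(0)

EX  = sum(p * x for x, p in zip(X, PX))       # E[X]   = -0.20
EX2 = sum(p * x**2 for x, p in zip(X, PX))    # E[X^2] =  4.00

assert abs(M1 - EX) < 1e-6
assert abs(M2 - EX2) < 1e-4
```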

## Exercise 5

Use the moment generating function to obtain the variances for the following distributions:

1. Exponential ($\lambda$)
2. Gamma ($\alpha, \lambda$)
3. Normal ($\mu, \sigma^2$)

### Solution

1. Exponential ($\lambda$):
$$M_X(s) = \frac{\lambda}{\lambda - s} \qquad M_X'(s) = \frac{\lambda}{(\lambda - s)^2} \qquad M_X''(s) = \frac{2\lambda}{(\lambda - s)^3} \tag{10}$$
$$E[X] = \frac{\lambda}{\lambda^2} = \frac{1}{\lambda} \qquad E[X^2] = \frac{2\lambda}{\lambda^3} = \frac{2}{\lambda^2} \qquad \mathrm{Var}[X] = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2} \tag{11}$$
2. Gamma ($\alpha, \lambda$):
$$M_X(s) = \left(\frac{\lambda}{\lambda - s}\right)^{\alpha} \qquad M_X'(s) = \alpha \left(\frac{\lambda}{\lambda - s}\right)^{\alpha - 1} \frac{\lambda}{(\lambda - s)^2} = \alpha \left(\frac{\lambda}{\lambda - s}\right)^{\alpha} \frac{1}{\lambda - s} \tag{12}$$
$$M_X''(s) = \alpha^2 \left(\frac{\lambda}{\lambda - s}\right)^{\alpha} \frac{1}{(\lambda - s)^2} + \alpha \left(\frac{\lambda}{\lambda - s}\right)^{\alpha} \frac{1}{(\lambda - s)^2} \tag{13}$$
$$E[X] = \frac{\alpha}{\lambda} \qquad E[X^2] = \frac{\alpha^2 + \alpha}{\lambda^2} \qquad \mathrm{Var}[X] = \frac{\alpha}{\lambda^2} \tag{14}$$
3. Normal ($\mu, \sigma^2$):
$$M_X(s) = \exp\left(\frac{\sigma^2 s^2}{2} + \mu s\right) \qquad M_X'(s) = M_X(s)(\sigma^2 s + \mu) \tag{15}$$
$$M_X''(s) = M_X(s)(\sigma^2 s + \mu)^2 + M_X(s)\sigma^2 \tag{16}$$
$$E[X] = \mu \qquad E[X^2] = \mu^2 + \sigma^2 \qquad \mathrm{Var}[X] = \sigma^2 \tag{17}$$
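Case 1 can be checked numerically (a Python sketch; the rate $\lambda$ is an illustrative choice, and central differences stand in for the derivatives):

```python
# Check case 1 (exponential) numerically: differentiate
# M_X(s) = lambda/(lambda - s) at s = 0 by central differences.
lam = 2.0   # illustrative rate, not from the exercise

def M(s):
    return lam / (lam - s)

h = 1e-5
EX  = (M(h) - M(-h)) / (2*h)            # ~ E[X]   = 1/lambda
EX2 = (M(h) - 2*M(0) + M(-h)) / h**2    # ~ E[X^2] = 2/lambda^2
var = EX2 - EX**2

assert abs(EX - 1/lam) < 1e-6
assert abs(var - 1/lam**2) < 1e-4
```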

## Exercise 6

The pair $\{X, Y\}$ is iid with common moment generating function $\dfrac{\lambda^3}{(\lambda - s)^3}$. Determine the moment generating function for $Z = 2X - 4Y + 3$.

### Solution

$$M_Z(s) = e^{3s} \left(\frac{\lambda}{\lambda - 2s}\right)^3 \left(\frac{\lambda}{\lambda + 4s}\right)^3 \tag{18}$$

## Exercise 7

The pair $\{X, Y\}$ is iid with common moment generating function $M_X(s) = (0.6 + 0.4e^s)$. Determine the moment generating function for $Z = 5X + 2Y$.

### Solution

$$M_Z(s) = (0.6 + 0.4e^{5s})(0.6 + 0.4e^{2s}) \tag{19}$$

## Exercise 8

Use the moment generating function for the symmetric triangular distribution on $(-c, c)$ as derived in the section "Three Basic Transforms".

1. Obtain an expression for the moment generating function of the symmetric triangular distribution on $(a, b)$ for any $a < b$.
2. Use the result of part (a) to show that the sum of two independent random variables uniform on $(a, b)$ has symmetric triangular distribution on $(2a, 2b)$.

### Solution

Let $m = (a + b)/2$ and $c = (b - a)/2$. If $Y$ is symmetric triangular on $(-c, c)$, then $X = Y + m$ is symmetric triangular on $(m - c, m + c) = (a, b)$ and

$$M_X(s) = e^{ms} M_Y(s) = \frac{e^{cs} + e^{-cs} - 2}{c^2 s^2}\, e^{ms} = \frac{e^{(m+c)s} + e^{(m-c)s} - 2e^{ms}}{c^2 s^2} = \frac{e^{bs} + e^{as} - 2e^{\frac{a+b}{2}s}}{\left(\frac{b-a}{2}\right)^2 s^2} \tag{20}$$

$$M_{X+Y}(s) = \left[\frac{e^{sb} - e^{sa}}{s(b - a)}\right]^2 = \frac{e^{2bs} + e^{2as} - 2e^{s(b+a)}}{s^2(b - a)^2} \tag{21}$$

which, by part (a), is the moment generating function for the symmetric triangular distribution on $(2a, 2b)$.
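The identity in (21) can be checked at a sample point (a Python sketch; $a$, $b$, and $s$ are arbitrary illustrative values):

```python
import math

# Check equation (21): the squared uniform-(a, b) m.g.f. matches the
# symmetric triangular m.g.f. on (2a, 2b) from part (a).
a, b, s = 1.0, 4.0, 0.3   # illustrative values, not from the exercise

unif_sq = ((math.exp(s*b) - math.exp(s*a)) / (s*(b - a)))**2

# part (a) formula evaluated on the interval (2a, 2b)
A, B = 2*a, 2*b
tri = (math.exp(B*s) + math.exp(A*s) - 2*math.exp((A + B)/2*s)) \
      / (((B - A)/2)**2 * s**2)

assert abs(unif_sq - tri) < 1e-9
```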

## Exercise 9

Random variable $X$ has moment generating function $\dfrac{p^2}{(1 - qe^s)^2}$.

1. Use derivatives to determine $E[X]$ and $\mathrm{Var}[X]$.
2. Recognize the distribution from the form and compare $E[X]$ and $\mathrm{Var}[X]$ with the result of part (a).

### Solution

$$\left[p^2(1 - qe^s)^{-2}\right]' = \frac{2p^2 q e^s}{(1 - qe^s)^3} \quad\text{so that}\quad E[X] = 2q/p \tag{22}$$

$$\left[p^2(1 - qe^s)^{-2}\right]'' = \frac{6p^2 q^2 e^{2s}}{(1 - qe^s)^4} + \frac{2p^2 q e^s}{(1 - qe^s)^3} \quad\text{so that}\quad E[X^2] = \frac{6q^2}{p^2} + \frac{2q}{p} \tag{23}$$

$$\mathrm{Var}[X] = \frac{2q^2}{p^2} + \frac{2q}{p} = \frac{2(q^2 + pq)}{p^2} = \frac{2q}{p^2} \tag{24}$$

$X \sim$ negative binomial $(2, p)$, which has $E[X] = 2q/p$ and $\mathrm{Var}[X] = 2q/p^2$, agreeing with part (a).

## Exercise 10

The pair $\{X, Y\}$ is independent, with $X \sim$ Poisson (4) and $Y \sim$ geometric (0.3). Determine the generating function $g_Z$ for $Z = 3X + 2Y$.

### Solution

$$g_Z(s) = g_X(s^3)\, g_Y(s^2) = e^{4(s^3 - 1)}\, \frac{0.3}{1 - qs^2}, \qquad q = 0.7 \tag{25}$$

## Exercise 11

Random variable X has moment generating function

$$M_X(s) = \frac{1}{1 - 3s} \exp\left(\frac{16 s^2}{2} + 3s\right) \tag{26}$$

By recognizing forms and using rules of combinations, determine $E[X]$ and $\mathrm{Var}[X]$.

### Solution

$$X = X_1 + X_2, \quad\text{with}\quad X_1 \sim \text{exponential}\,(1/3), \quad X_2 \sim N(3, 16) \tag{27}$$

$$E[X] = 3 + 3 = 6 \qquad \mathrm{Var}[X] = 9 + 16 = 25 \tag{28}$$

## Exercise 12

Random variable X has moment generating function

$$M_X(s) = \exp(3(e^s - 1)) \cdot \frac{1}{1 - 5s} \cdot \exp\left(\frac{16 s^2}{2} + 3s\right) \tag{29}$$

By recognizing forms and using rules of combinations, determine $E[X]$ and $\mathrm{Var}[X]$.

### Solution

$$X = X_1 + X_2 + X_3, \quad\text{with}\quad X_1 \sim \text{Poisson}\,(3), \quad X_2 \sim \text{exponential}\,(1/5), \quad X_3 \sim N(3, 16) \tag{30}$$

$$E[X] = 3 + 5 + 3 = 11 \qquad \mathrm{Var}[X] = 3 + 25 + 16 = 44 \tag{31}$$

## Exercise 13

Suppose the class $\{A, B, C\}$ of events is independent, with respective probabilities 0.3, 0.5, 0.2. Consider

$$X = -3I_A + 2I_B + 4I_C \tag{32}$$
1. Determine the moment generating functions for $I_A$, $I_B$, $I_C$ and use properties of moment generating functions to determine the moment generating function for $X$.
2. Use the moment generating function to determine the distribution for $X$.
3. Use canonic to determine the distribution. Compare with the result of part (b).
4. Use distributions for the separate terms and determine the distribution for the sum with mgsum3. Compare with the result of part (b).

### Solution

$$M_X(s) = (0.7 + 0.3e^{-3s})(0.5 + 0.5e^{2s})(0.8 + 0.2e^{4s}) \tag{33}$$

$$= 0.12e^{-3s} + 0.12e^{-s} + 0.28 + 0.03e^{s} + 0.28e^{2s} + 0.03e^{3s} + 0.07e^{4s} + 0.07e^{6s} \tag{34}$$

The distribution is

$$X = \begin{bmatrix} -3 & -1 & 0 & 1 & 2 & 3 & 4 & 6 \end{bmatrix} \qquad PX = \begin{bmatrix} 0.12 & 0.12 & 0.28 & 0.03 & 0.28 & 0.03 & 0.07 & 0.07 \end{bmatrix} \tag{35}$$
```matlab
c = [-3 2 4 0];
P = 0.1*[3 5 2];
canonic
Enter row vector of coefficients  c
Enter row vector of minterm probabilities  minprob(P)
Use row matrices X and PX for calculations
Call for XDBN to view the distribution
P1 = [0.7 0.3];
P2 = [0.5 0.5];
P3 = [0.8 0.2];
X1 = [0 -3];
X2 = [0 2];
X3 = [0 4];
[x,px] = mgsum3(X1,X2,X3,P1,P2,P3);
disp([X;PX;x;px]')
   -3.0000    0.1200   -3.0000    0.1200
   -1.0000    0.1200   -1.0000    0.1200
         0    0.2800         0    0.2800
    1.0000    0.0300    1.0000    0.0300
    2.0000    0.2800    2.0000    0.2800
    3.0000    0.0300    3.0000    0.0300
    4.0000    0.0700    4.0000    0.0700
    6.0000    0.0700    6.0000    0.0700
```
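canonic, minprob, and mgsum3 are utilities from the MATLAB toolbox that accompanies the text. The same distribution can be checked with a plain enumeration (a Python sketch):

```python
from itertools import product
from collections import defaultdict

# Enumerate the eight minterms of the independent class {A, B, C}
# and accumulate the distribution of X = -3 I_A + 2 I_B + 4 I_C.
pA, pB, pC = 0.3, 0.5, 0.2

dist = defaultdict(float)
for a, b, c in product([0, 1], repeat=3):
    prob = ((pA if a else 1 - pA) *
            (pB if b else 1 - pB) *
            (pC if c else 1 - pC))
    dist[-3*a + 2*b + 4*c] += prob

# the accumulated values match the distribution displayed above
```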


## Exercise 14

Suppose the pair $\{X, Y\}$ is independent, with both $X$ and $Y$ binomial. Use generating functions to show under what condition, if any, $X + Y$ is binomial.

### Solution

Binomial iff both have the same $p$, as shown below.

$$g_{X+Y}(s) = (q_1 + p_1 s)^n (q_2 + p_2 s)^m = (q + ps)^{n+m} \quad\text{iff}\quad p_1 = p_2 \tag{36}$$

## Exercise 15

Suppose the pair $\{X, Y\}$ is independent, with both $X$ and $Y$ Poisson.

1. Use generating functions to show under what condition $X + Y$ is Poisson.

### Solution

Always Poisson, as the argument below shows.

$$g_{X+Y}(s) = e^{\mu(s-1)} e^{\nu(s-1)} = e^{(\mu + \nu)(s-1)} \tag{37}$$

However, $Y - X$ could have negative values.

## Exercise 16

Suppose the pair $\{X, Y\}$ is independent, $Y$ is nonnegative integer-valued, $X$ is Poisson, and $X + Y$ is Poisson. Use generating functions to show that $Y$ is Poisson.

### Solution

$E[X+Y] = \mu + \nu$, where $\nu = E[Y] > 0$. $g_X(s) = e^{\mu(s-1)}$ and $g_{X+Y}(s) = g_X(s) g_Y(s) = e^{(\mu+\nu)(s-1)}$. Division by $g_X(s)$ gives $g_Y(s) = e^{\nu(s-1)}$, the generating function for the Poisson ($\nu$) distribution.

## Exercise 17

Suppose the pair $\{X, Y\}$ is iid, binomial $(6, 0.51)$. By the result of Exercise 14, $X + Y$ is binomial. Use mgsum to obtain the distribution for $Z = 2X + 4Y$. Does $Z$ have the binomial distribution? Is the result surprising? Examine the first few possible values for $Z$. Write the generating function for $Z$; does it have the form for the binomial distribution?

### Solution

```matlab
x  = 0:6;
px = ibinom(6,0.51,x);
[Z,PZ] = mgsum(2*x,4*x,px,px);
disp([Z(1:5);PZ(1:5)]')
         0    0.0002       % Cannot be binomial, since odd values missing
    2.0000    0.0012
    4.0000    0.0043
    6.0000    0.0118
    8.0000    0.0259
```

$$g_X(s) = g_Y(s) = (0.49 + 0.51s)^6 \qquad g_Z(s) = (0.49 + 0.51s^2)^6 (0.49 + 0.51s^4)^6 \tag{38}$$
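A stdlib stand-in for the mgsum call (a Python sketch) makes the "odd values missing" observation explicit by convolving the distributions of $2X$ and $4Y$:

```python
from math import comb
from collections import defaultdict

# Convolve the distributions of 2X and 4Y for X, Y iid binomial (6, 0.51).
n, p = 6, 0.51
px = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

pz = defaultdict(float)
for i, pi in px.items():
    for j, pj in px.items():
        pz[2*i + 4*j] += pi * pj

# every possible value of Z is even, so Z cannot be binomial
assert all(z % 2 == 0 for z in pz)
assert abs(sum(pz.values()) - 1) < 1e-12
```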

## Exercise 18

Suppose the pair $\{X, Y\}$ is independent, with $X \sim$ binomial $(5, 0.33)$ and $Y \sim$ binomial $(7, 0.47)$.

Let $G = g(X) = 3X^2 - 2X$ and $H = h(Y) = 2Y^2 + Y + 3$.

1. Use mgsum to obtain the distribution for $G + H$.
2. Use icalc and csort to obtain the distribution for $G + H$ and compare with the result of part (a).

### Solution

```matlab
X = 0:5;
Y = 0:7;
PX = ibinom(5,0.33,X);
PY = ibinom(7,0.47,Y);
G = 3*X.^2 - 2*X;
H = 2*Y.^2 + Y + 3;
[Z,PZ] = mgsum(G,H,PX,PY);

icalc
Enter row matrix of X-values  X
Enter row matrix of Y-values  Y
Enter X probabilities  PX
Enter Y probabilities  PY
Use array operations on matrices X, Y, PX, PY, t, u, and P
M = 3*t.^2 - 2*t + 2*u.^2 + u + 3;
[z,pz] = csort(M,P);
e = max(abs(pz - PZ))  % Comparison of p values
e =  0
```


## Exercise 19

Suppose the pair $\{X, Y\}$ is independent, with $X \sim$ binomial $(8, 0.39)$ and $Y \sim$ uniform on $\{-1.3, -0.5, 1.3, 2.2, 3.5\}$. Let

$$U = 3X^2 - 2X + 1 \quad\text{and}\quad V = Y^3 + 2Y - 3 \tag{39}$$

1. Use mgsum to obtain the distribution for $U + V$.
2. Use icalc and csort to obtain the distribution for $U + V$ and compare with the result of part (a).

### Solution

```matlab
X = 0:8;
Y = [-1.3 -0.5 1.3 2.2 3.5];
PX = ibinom(8,0.39,X);
PY = (1/5)*ones(1,5);
U  = 3*X.^2 - 2*X + 1;
V  = Y.^3 + 2*Y - 3;
[Z,PZ] = mgsum(U,V,PX,PY);
icalc
Enter row matrix of X-values  X
Enter row matrix of Y-values  Y
Enter X probabilities  PX
Enter Y probabilities  PY
Use array operations on matrices X, Y, PX, PY, t, u, and P
M = 3*t.^2 - 2*t + 1 + u.^3 + 2*u - 3;
[z,pz] = csort(M,P);
e = max(abs(pz - PZ))
e = 0
```


## Exercise 20

If $X$ is a nonnegative integer-valued random variable, express the generating function as a power series.

1. Show that the $k$th derivative at $s = 1$ is
$$g_X^{(k)}(1) = E[X(X-1)(X-2) \cdots (X-k+1)] \tag{40}$$
2. Use this to show that $\mathrm{Var}[X] = g_X''(1) + g_X'(1) - [g_X'(1)]^2$.

### Solution

Since power series may be differentiated term by term,

$$g_X^{(n)}(s) = \sum_{k=n}^{\infty} k(k-1) \cdots (k-n+1)\, p_k s^{k-n} \quad\text{so that} \tag{41}$$

$$g_X^{(n)}(1) = \sum_{k=n}^{\infty} k(k-1) \cdots (k-n+1)\, p_k = E[X(X-1) \cdots (X-n+1)] \tag{42}$$

$$\mathrm{Var}[X] = E[X^2] - E^2[X] = E[X(X-1)] + E[X] - E^2[X] = g_X''(1) + g_X'(1) - [g_X'(1)]^2 \tag{43}$$
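The factorial-moment formula can be illustrated for a Poisson ($\mu$) variable (a Python sketch; the value of $\mu$ is an arbitrary choice). There $g_X(s) = e^{\mu(s-1)}$, so $g_X'(1) = \mu$, $g_X''(1) = \mu^2$, and the variance formula gives $\mu^2 + \mu - \mu^2 = \mu$:

```python
import math

# Compute the first two factorial moments of Poisson(mu) from the
# distribution and check them against g'(1) = mu and g''(1) = mu^2.
mu = 1.7   # illustrative value
pk = [math.exp(-mu) * mu**k / math.factorial(k) for k in range(60)]

mean  = sum(k * p for k, p in enumerate(pk))            # g'(1)  = E[X]
fact2 = sum(k * (k - 1) * p for k, p in enumerate(pk))  # g''(1) = E[X(X-1)]

var = fact2 + mean - mean**2
assert abs(fact2 - mu**2) < 1e-9
assert abs(var - mu) < 1e-9
```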

## Exercise 21

Let $M_X(\cdot)$ be the moment generating function for $X$.

1. Show that $\mathrm{Var}[X]$ is the second derivative of $e^{-s\mu} M_X(s)$ evaluated at $s = 0$.
2. Use this fact to show that if $X \sim N(\mu, \sigma^2)$, then $\mathrm{Var}[X] = \sigma^2$.

### Solution

$$f(s) = e^{-s\mu} M_X(s) \qquad f''(s) = e^{-s\mu}\left[-\mu M_X'(s) + \mu^2 M_X(s) + M_X''(s) - \mu M_X'(s)\right] \tag{44}$$

Setting $s = 0$ and using the result on moments gives

$$f''(0) = -\mu^2 + \mu^2 + E[X^2] - \mu^2 = \mathrm{Var}[X] \tag{45}$$

## Exercise 22

Use derivatives of $M_X(s)$ to obtain the mean and variance of the negative binomial $(m, p)$ distribution.

### Solution

To simplify writing, use $f(s)$ for $M_X(s)$.

$$f(s) = \frac{p^m}{(1 - qe^s)^m} \qquad f'(s) = \frac{m p^m q e^s}{(1 - qe^s)^{m+1}} \qquad f''(s) = \frac{m p^m q e^s}{(1 - qe^s)^{m+1}} + \frac{m(m+1) p^m q^2 e^{2s}}{(1 - qe^s)^{m+2}} \tag{46}$$

$$E[X] = \frac{m p^m q}{(1 - q)^{m+1}} = \frac{mq}{p} \qquad E[X^2] = \frac{mq}{p} + \frac{m(m+1) p^m q^2}{(1 - q)^{m+2}} = \frac{mq}{p} + \frac{m(m+1) q^2}{p^2} \tag{47}$$

$$\mathrm{Var}[X] = \frac{mq}{p} + \frac{m(m+1) q^2}{p^2} - \frac{m^2 q^2}{p^2} = \frac{mq}{p^2} \tag{48}$$
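The formulas can be checked against the distribution directly (a Python sketch; $m$ and $p$ are illustrative values, and the "number of failures" form of the negative binomial, matching the m.g.f. above, is assumed):

```python
from math import comb

# Numerical check of E[X] = mq/p and Var[X] = mq/p^2 for the negative
# binomial (m, p) in the "number of failures" form,
# P(X = k) = C(m + k - 1, k) p^m q^k, k = 0, 1, 2, ...
m, p = 2, 0.4   # illustrative values
q = 1 - p

pk = [comb(m + k - 1, k) * p**m * q**k for k in range(400)]

mean = sum(k * pr for k, pr in enumerate(pk))
var  = sum(k * k * pr for k, pr in enumerate(pk)) - mean**2

assert abs(mean - m*q/p) < 1e-9
assert abs(var - m*q/p**2) < 1e-9
```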

## Exercise 23

Use moment generating functions to show that variances add for the sum or difference of independent random variables.

### Solution

To simplify writing, set $f(s) = M_X(s)$, $g(s) = M_Y(s)$, and $h(s) = M_X(s) M_Y(s)$.

$$h'(s) = f'(s) g(s) + f(s) g'(s) \qquad h''(s) = f''(s) g(s) + f'(s) g'(s) + f'(s) g'(s) + f(s) g''(s) \tag{49}$$

Setting $s = 0$ yields

$$E[X+Y] = E[X] + E[Y] \qquad E[(X+Y)^2] = E[X^2] + 2E[X]E[Y] + E[Y^2] \tag{50}$$

$$E^2[X+Y] = E^2[X] + 2E[X]E[Y] + E^2[Y] \tag{51}$$

Taking the difference gives $\mathrm{Var}[X+Y] = \mathrm{Var}[X] + \mathrm{Var}[Y]$. A similar treatment with $g(s)$ replaced by $g(-s)$ shows $\mathrm{Var}[X-Y] = \mathrm{Var}[X] + \mathrm{Var}[Y]$.

## Exercise 24

The pair $\{X, Y\}$ is iid $N(3, 5)$. Use the moment generating function to show that $Z = 3X - 2Y + 3$ is normal (see Example 3 from "Transform Methods" for the general result).

### Solution

$$M_{3X}(s) = M_X(3s) = \exp\left(\frac{9 \cdot 5\, s^2}{2} + 3 \cdot 3s\right) \qquad M_{-2Y}(s) = M_Y(-2s) = \exp\left(\frac{4 \cdot 5\, s^2}{2} - 2 \cdot 3s\right) \tag{52}$$

$$M_Z(s) = e^{3s} \exp\left(\frac{(45 + 20)s^2}{2} + (9 - 6)s\right) = \exp\left(\frac{65 s^2}{2} + 6s\right) \tag{53}$$

## Exercise 25

Use the central limit theorem to show that for large enough sample size (usually 20 or more), the sample average

$$A_n = \frac{1}{n} \sum_{i=1}^{n} X_i \tag{54}$$

is approximately $N(\mu, \sigma^2/n)$ for any reasonable population distribution having mean value $\mu$ and variance $\sigma^2$.

### Solution

$$E[A_n] = \frac{1}{n} \sum_{i=1}^{n} \mu = \mu \qquad \mathrm{Var}[A_n] = \frac{1}{n^2} \sum_{i=1}^{n} \sigma^2 = \frac{\sigma^2}{n} \tag{55}$$

By the central limit theorem, $A_n$ is approximately normal, with the mean and variance above.

## Exercise 26

A population has standard deviation approximately three. It is desired to determine the sample size n needed to ensure that with probability 0.95 the sample average will be within 0.5 of the mean value.

1. Use the Chebyshev inequality to estimate the needed sample size.
2. Use the normal approximation to estimate n (see Example 1 from "Simple Random Samples and Statistics").

### Solution

• Chebyshev inequality:
$$P\left(|A_n - \mu| \geq 0.5\right) \leq \frac{3^2}{n(0.5)^2} \leq 0.05 \quad\text{implies}\quad n \geq 720 \tag{56}$$
• Normal approximation: Use of the table in Example 1 from "Simple Random Samples and Statistics" shows
$$n \geq (3/0.5)^2 \cdot 3.84 \approx 138.3, \quad\text{so}\quad n = 139 \tag{57}$$
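Both estimates can be reproduced in a few lines (a Python sketch; $z = 1.96$ is assumed for the 95% normal approximation):

```python
import math

# Sample-size estimates: sigma = 3, tolerance 0.5, probability 0.95.
sigma, eps, alpha = 3.0, 0.5, 0.05

# Chebyshev: P(|A_n - mu| >= eps) <= sigma^2 / (n eps^2) <= alpha
n_cheb = math.ceil(sigma**2 / (alpha * eps**2))

# Normal approximation: need z * sigma / sqrt(n) <= eps, z = 1.96 assumed
n_norm = math.ceil((1.96 * sigma / eps)**2)

print(n_cheb, n_norm)
```

The much larger Chebyshev figure reflects the inequality's generality: it holds for any distribution with finite variance, so it is far more conservative than the normal approximation.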
