Consider the random variable $Y$ formed as the sum of two independent random variables $X_1$ and $X_2$:

$$Y = X_1 + X_2 \qquad (1)$$

where $X_1$ has pdf $f_1(x_1)$ and $X_2$ has pdf $f_2(x_2)$.
We can write the joint pdf for $y$ and $x_1$ by rewriting the conditional probability formula:

$$f(y, x_1) = f(y \mid x_1) \, f_1(x_1) \qquad (2)$$
It is clear that the event '$Y$ takes the value $y$ conditional upon $X_1 = x_1$' is equivalent to $X_2$ taking the value $y - x_1$ (since $X_2 = Y - X_1$). Hence
$$f(y \mid x_1) = f_2(y - x_1) \qquad (3)$$
Now $f(y)$ may be obtained using the Marginal Probability formula ((Reference) from (Reference)). Hence

$$f(y) = \int f(y \mid x_1) \, f_1(x_1) \, dx_1 = \int f_2(y - x_1) \, f_1(x_1) \, dx_1 = f_2 * f_1 \qquad (4)$$
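As a numerical illustration (my own sketch, not from the original text), the convolution integral in (4) can be approximated on a grid. Here two uniform pdfs on $[0, 1]$ are convolved, giving the expected triangular pdf on $[0, 2]$:

```python
import numpy as np

# Sample two uniform pdfs f1, f2 on [0, 1] over a fine grid.
dx = 0.001
x = np.arange(0.0, 1.0, dx)
f1 = np.ones_like(x)          # uniform pdf on [0, 1]
f2 = np.ones_like(x)          # uniform pdf on [0, 1]

# f(y) = (f2 * f1)(y): the discrete convolution approximates the
# integral in (4) when scaled by the grid spacing dx.
fy = np.convolve(f2, f1) * dx
y = np.arange(len(fy)) * dx

print(y[np.argmax(fy)])       # peak near y = 1, the apex of the triangle
print(fy.sum() * dx)          # total area near 1, as a pdf requires
```

The scaling by `dx` turns the discrete sum computed by `np.convolve` into an approximation of the continuous convolution integral.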
This result may be extended to sums of three or more random
variables by repeated application of the above arguments for
each new variable in turn. Since convolution is a commutative
operation, for n independent variables we get:
$$f(y) = f_n * f_{n-1} * \cdots * f_2 * f_1 \qquad (5)$$
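A quick numerical sketch of the $n$-fold result in (5) (my own example, with arbitrary hypothetical pmfs): repeated `np.convolve` calls build $f_n * \cdots * f_1$, and because convolution commutes the order of the factors does not matter:

```python
import numpy as np
from functools import reduce

# Three arbitrary (hypothetical) discrete pmfs over small supports.
pmfs = [np.array([0.2, 0.8]),
        np.array([0.5, 0.5]),
        np.array([0.1, 0.3, 0.6])]

# Fold the list with convolution: f3 * f2 * f1.
forward = reduce(np.convolve, pmfs)
# Reverse order: f1 * f2 * f3 -- commutativity gives the same pmf.
backward = reduce(np.convolve, pmfs[::-1])

print(np.allclose(forward, backward))  # True
```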
An example of this effect occurs when multiple dice are thrown and the scores are added together. In the two-dice example of subfigures (a), (b), (c) of (Reference) we saw how the pmf approximated a triangular shape. This is just the convolution of two uniform 6-point pmfs, one for each of the two dice.
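This is easy to check numerically (a sketch of mine, assuming fair dice): convolving the uniform 6-point pmf with itself gives the triangular pmf of the two-dice total:

```python
import numpy as np

die = np.full(6, 1 / 6)           # uniform pmf for scores 1..6
total = np.convolve(die, die)     # pmf of the sum, scores 2..12

scores = np.arange(2, 13)
for s, p in zip(scores, total):
    print(s, round(p * 36))       # counts out of 36: 1,2,3,4,5,6,5,4,3,2,1
```

The output is the familiar symmetric triangle, peaking at a score of 7 with probability 6/36.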
Similarly, if two variables with Gaussian pdfs are added together, we shall show in (Reference) that this produces another Gaussian pdf, whose variance is the sum of the two input variances.
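Pending that derivation, the variance-addition property can be checked numerically (my own sketch): sample two zero-mean Gaussian pdfs, convolve them as in (4), and measure the variance of the result:

```python
import numpy as np

dx = 0.01
x = np.arange(-10, 10, dx)
s1, s2 = 1.0, 2.0    # standard deviations of the two inputs


def gauss(x, s):
    """Zero-mean Gaussian pdf with standard deviation s."""
    return np.exp(-x**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))


# f(y) = (f2 * f1)(y), approximated on the grid.
fy = np.convolve(gauss(x, s1), gauss(x, s2)) * dx
y = np.arange(len(fy)) * dx
y -= y.mean()        # re-centre the output grid about zero

var = np.sum(y**2 * fy) * dx
print(var)           # close to s1**2 + s2**2 = 5.0
```

The measured variance comes out close to $\sigma_1^2 + \sigma_2^2 = 5$, consistent with the claim.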