Differential Entropy

Module by: Anders Gjendemsjø

Summary: In this module we consider differential entropy.

We now consider the entropy of continuous random variables. Whereas the ordinary entropy is defined for discrete random variables, the differential entropy is its counterpart for continuous random variables.

Differential Entropy

Definition 1: Differential entropy
The differential entropy $h(X)$ of a continuous random variable $X$ with a pdf $f(x)$ is defined as
$$h(X) = -\int f(x) \log f(x) \, dx$$
(1)
Usually the logarithm is taken to be base 2, so that the unit of the differential entropy is bits/symbol. Note that, as in the discrete case, $h(X)$ depends only on the pdf of $X$. Finally, we note that the differential entropy is the expected value of $-\log f(x)$, i.e.,
$$h(X) = -\mathrm{E}[\log f(x)]$$
(2)
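The definition translates directly into a numerical approximation. Below is a minimal sketch, not part of the original module, that estimates Equation 1 with a simple Riemann sum; the standard normal pdf used for the check is an assumed example.

```python
import numpy as np

def differential_entropy(pdf, a, b, n=100_000):
    """Approximate h(X) = -integral of f(x) * log2 f(x) dx over [a, b]."""
    x = np.linspace(a, b, n)
    f = pdf(x)
    # By convention, 0 * log 0 = 0 where the density vanishes.
    integrand = np.zeros_like(f)
    mask = f > 0
    integrand[mask] = -f[mask] * np.log2(f[mask])
    return np.sum(integrand) * (b - a) / n  # simple Riemann sum

# Assumed example: standard normal pdf, integrated over a wide interval.
gaussian = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
print(differential_entropy(gaussian, -10, 10))  # ~0.5*log2(2*pi*e) ≈ 2.05
```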

Now, consider calculating the differential entropy of some random variables.

Example 1

Consider a uniformly distributed random variable $X$ from $c$ to $c+\Delta$. Then its density is $f(x) = \frac{1}{\Delta}$ from $c$ to $c+\Delta$, and zero otherwise.

We can then find its differential entropy as follows,

$$h(X) = -\int_{c}^{c+\Delta} \frac{1}{\Delta} \log \frac{1}{\Delta} \, dx = \log \Delta$$
(3)
Note that by making $\Delta$ arbitrarily small, the differential entropy can be made arbitrarily negative, while taking $\Delta$ arbitrarily large, the differential entropy becomes arbitrarily positive.
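Equation 3 is easy to check numerically. The sketch below, with assumed values for $c$ and $\Delta$ that are not from the text, integrates the uniform density and compares against $\log_2 \Delta$.

```python
import numpy as np

c, delta = 0.0, 0.25                  # assumed illustrative values
x = np.linspace(c, c + delta, 100_000)
f = np.full_like(x, 1.0 / delta)      # uniform pdf: 1/Delta on [c, c + Delta]
h = np.sum(-f * np.log2(f)) * delta / x.size
print(h, np.log2(delta))              # both ≈ -2.0; negative since Delta < 1
```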

Example 2

Consider a normally distributed random variable $X$, with mean $m$ and variance $\sigma^2$. Then its density is $f(x) = \frac{1}{\sqrt{2\pi\sigma^2}} e^{-\frac{(x-m)^2}{2\sigma^2}}$.

We can then find its differential entropy as follows; first calculate $-\log f(x)$:

$$-\log f(x) = \frac{1}{2}\log(2\pi\sigma^2) + \frac{(x-m)^2}{2\sigma^2}\log e$$
(4)
Then since $\mathrm{E}[(X-m)^2] = \sigma^2$, we have
$$h(X) = \frac{1}{2}\log(2\pi\sigma^2) + \frac{1}{2}\log e = \frac{1}{2}\log(2\pi e \sigma^2)$$
(5)
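Since Equation 2 expresses $h(X)$ as the expected value of $-\log f(x)$, the Gaussian result can also be checked by a Monte Carlo average. Below is a minimal sketch; the values of $m$ and $\sigma$ are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
m, sigma = 1.0, 2.0                        # assumed illustrative parameters
samples = rng.normal(m, sigma, 1_000_000)

# -log2 f(x) for the Gaussian density, as in Equation 4.
neg_log_f = 0.5 * np.log2(2 * np.pi * sigma**2) \
            + (samples - m)**2 / (2 * sigma**2) * np.log2(np.e)
h_mc = np.mean(neg_log_f)                  # Monte Carlo estimate of h(X)
h_closed = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)  # Equation 5
print(h_mc, h_closed)                      # both ≈ 3.05 bits
```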

Properties of the differential entropy

In this section we list some properties of the differential entropy.

  • The differential entropy can be negative
  • $h(X+c) = h(X)$, that is, translation does not change the differential entropy.
  • $h(aX) = h(X) + \log|a|$, that is, scaling does change the differential entropy.
The first property is seen from both Example 1 and Example 2. The latter two can be shown by a change of variables in Equation 1, and are illustrated numerically in the sketch below.
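The sketch below, using an assumed uniform density as the test case, checks the translation and scaling properties against Equation 1; the values of $c$ and $a$ are arbitrary.

```python
import numpy as np

def h_uniform(lo, hi, n=100_000):
    """h(X) for a uniform density on [lo, hi], via Equation 1."""
    x = np.linspace(lo, hi, n)
    f = np.full_like(x, 1.0 / (hi - lo))
    return np.sum(-f * np.log2(f)) * (hi - lo) / n

h = h_uniform(0.0, 2.0)                     # X uniform on [0, 2]: h = 1 bit
print(h_uniform(5.0, 7.0), h)               # translation by c = 5: unchanged
a = 4.0
print(h_uniform(0.0, 8.0), h + np.log2(a))  # scaling by a = 4: 1 + log2(4) = 3
```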
