MATLAB Calculations for Decision Models

Module by: Paul E Pfeiffer

Summary: Additional examples of MATLAB calculations for decision problems (see MATLAB Procedures for Markov Decision Processes).

Data

There are three types of problem, according to the data provided.  In all types, we need the following:

A = the vector of actions  (1 × m),  m = the number of actions
PH:  PH(i) = P(H = u_i)  (1 × s),  s = the number of values of H
PXH:  PXH(i,j) = P(X = x_j|H = u_i)  (s × q),  q = the number of values of X

  • Type 1:  The usual type.  In addition to the above, we need
        L = [L(a, y_k)]  (m × n),  m = the number of actions
        PYH:  PYH(i,k) = P(Y = y_k|H = u_i)  (s × n),  n = the number of values of Y
  • Type 2:  The matrix RH = [r(a,i)] is given.  L and PYH are not needed.
  • Type 3:  Sometimes Y = H.  In this case RH = L, which is needed in addition to the data above.

Calculated quantities

  1. RH = [r(a,i)]  (m × s)      [Risk function = expected loss, given H]
           r(a,i) = E[L(a,Y)|H = u_i] = ∑_k L(a,k) P(Y = y_k|H = u_i)
     MATLAB:  RH = L*PYH'
  2. PX  (1 × q)
           PX(j) = P(X = x_j) = ∑_i P(H = u_i) P(X = x_j|H = u_i)
     MATLAB:  PX = PH*PXH
  3. PHX  (q × s)
           PHX(j,i) = P(H = u_i|X = x_j) = P(X = x_j|H = u_i) P(H = u_i)/P(X = x_j)
     MATLAB:  [a,b] = meshgrid(PH,PX);  PHX = PXH'.*a./b
  4. RX = [R(a,j)]  (m × q)      [Expected risk, given X]
           R(a,j) = E[r(a,H)|X = x_j] = ∑_i r(a,i) P(H = u_i|X = x_j)
     MATLAB:  RX = RH*PHX'
  5. Select d* from RX:  d*(j) is the action a (row number) with minimum expected loss, given X = x_j.  Set D = [d*(1), d*(2), ..., d*(q)].
  6. Calculate the Bayesian risk BD for d* (see the sketch after this list).
           BD = E[R(d*(X),X)] = ∑_j RX(D(j),j) PX(j)
     MATLAB:  BD = RD*PX'
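The following is a minimal sketch of the six steps above for a small, hypothetical Type 1 data set (the numbers are illustrative only and do not come from the examples below); each line uses the MATLAB expression listed with the corresponding step.

% Sketch of steps 1-6 for a hypothetical Type 1 problem
% (illustrative data only -- not from the examples below)
A   = [1 2];                  % two actions (values = position numbers)
PH  = [0.4 0.6];              % PH(i) = P(H = u_i),  s = 2
PXH = [0.8 0.2; 0.3 0.7];     % PXH(i,j) = P(X = x_j|H = u_i),  (s x q)
X   = [0 1];                  % test random variable values,  q = 2
L   = [1 -1; 2 -3];           % L(a,k),  (m x n),  m = 2, n = 2
PYH = [0.6 0.4; 0.2 0.8];     % PYH(i,k) = P(Y = y_k|H = u_i),  (s x n)
RH  = L*PYH';                 % 1. risk function r(a,i),  (m x s)
PX  = PH*PXH;                 % 2. PX(j) = P(X = x_j),  (1 x q)
[a,b] = meshgrid(PH,PX);
PHX = PXH'.*a./b;             % 3. PHX(j,i) = P(H = u_i|X = x_j),  (q x s)
RX  = RH*PHX';                % 4. expected risk, given X,  (m x q)
[RD,D] = min(RX);             % 5. minimum of each column and its row number
BD  = RD*PX';                 % 6. Bayesian risk for d*
disp([X; A(D); RD]')          % test value, optimum action, expected loss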

Note:

Actions are represented in calculations by action number (position in the matrix). In some cases, each action has a value other than its position number. The actual values can be presented in the final display.
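As a small illustration of this point (with an assumed decision vector), the row numbers in D can be converted to action values by indexing, which is how dec.m forms its display matrix S:

% Hypothetical illustration: D holds action numbers (row positions in RX);
% A(D) replaces them with the actual action values for display
A = [10 15];          % action values, as in Example 1 below
D = [2 2 2];          % assumed optimum action number for each value of X
A(D)                  % ans =  15  15  15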

file dec.m

% file dec.m
% Version of 12/12/95
disp('Decision process with experimentation')
disp('There are three types, according to the data provided.')
disp('In all types, we need the row vector A of actions,')
disp('the row vector PH with PH(i) = P(H = u_i),')
disp('the row vector X of test random variable values, and')
disp('the matrix PXH with PXH(i,j) = P(X = x_j|H = u_i).')
disp('Type 1.  Loss matrix L of L(a,k)')
disp('         Matrix PYH with PYH(i,k) = P(Y = y_k|H = u_i)')
disp('Type 2.  Matrix RH of r(a,i) = E[L(a,Y)|H = u_i].')
disp('         L and PYH are not needed for this type.')
disp('Type 3.  Y = H, so that only RH = L is needed.')
c   = input('Enter type number  ');
A   = input('Enter vector A of actions ');
PH  = input('Enter vector PH of parameter probabilities  ');
PXH = input('Enter matrix PXH of conditional probabilities  ');
X   = input('Enter vector X of test random variable values  ');
s = length(PH);
q = length(X);
if c == 1
 L   = input('Enter loss matrix L  ');
 PYH = input('Enter matrix PYH of conditional probabilities  ');
 RH  = L*PYH';
elseif c == 2
 RH  = input('Enter matrix RH of expected loss, given H  ');
else
 L   = input('Enter loss matrix L  ');
 RH  = L;
end
PX   = PH*PXH;        % (1 x s)(s x q) = (1 x q)
[a,b] = meshgrid(PH,PX);
PHX = PXH'.*a./b;     % (q x s)
RX  = RH*PHX';        % (m x s)(s x q) = (m x q)
[RD D] = min(RX);     % determines min of each col
                      % and row on which min occurs
S = [X; A(D); RD]';
BD = RD*PX';          % Bayesian risk
h  = ['  Optimum losses and actions'];
sh = ['  Test value  Action     Loss'];
disp(' ')
disp(h)
disp(sh)
disp(S)
disp(' ')
disp(['Bayesian risk  B(d*) = ',num2str(BD)])

Example 1: General case

% file dec1.m
% Data for Problem 22-11
type = 1;
A = [10 15];          % Artificial actions list
PH = [0.3 0.2 0.5];   % PH(i) = P(H = i)
PXH = [0.7 0.2 0.1;   % PXH(i,j) = P(X = j|H= i)
      0.2 0.6 0.2;
      0.1 0.1 0.8];
X = [-1 0  1];
L = [1  0 -2;         % L(a,k) = loss when action number is a, outcome is k
    3 -1 -4];
PYH = [0.5 0.3 0.2;   % PYH(i,k) = P(Y = k|H = i)
      0.2 0.5 0.3;
      0.1 0.3 0.6];
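
The data file dec1 is run first to put the variables in the workspace; the procedure dec is then called, and each prompt is answered with the name of the corresponding variable.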
 
dec1
dec
Decision process with experimentation
There are three types, according to the data provided.
In all types, we need the row vector A of actions,
the row vector PH with PH(i) = P(H = i),
the row vector X of test random variable values, and
the matrix PXH with PXH(i,j) = P(X = j|H = i).
Type 1.  Loss matrix L of L(a,k)
        Matrix PYH with PYH(i,k) = P(Y = k|H = i)
Type 2.  Matrix RH of r(a,i) = E[L(a,Y)|H = i].
        L and PYH are not needed in this case.
Type 3.  Y = H, so that only RH = L is needed.
Enter type number  type
Enter vector A of actions A
Enter vector PH of parameter probabilities  PH
Enter matrix PXH of conditional probabilities  PXH
Enter vector X of test random variable values  X
Enter loss matrix L  L
Enter matrix PYH of conditional probabilities  PYH
 
 Optimum losses and actions
 Test value  Action     Loss
  -1.0000   15.0000   -0.2667
        0   15.0000   -0.9913
   1.0000   15.0000   -2.1106
 
Bayesian risk  B(d*) = -1.3

Intermediate steps in the solution of Example 1, showing the results of the various operations

RH
RH  =  0.1000   -0.4000   -1.1000
      0.4000   -1.1000   -2.4000
PX
PX  =  0.3000    0.2300    0.4700
a
a   =  0.3000    0.2000    0.5000
      0.3000    0.2000    0.5000
      0.3000    0.2000    0.5000
b
b   =  0.3000    0.3000    0.3000
      0.2300    0.2300    0.2300
      0.4700    0.4700    0.4700
PHX
PHX =  0.7000    0.1333    0.1667
      0.2609    0.5217    0.2174
      0.0638    0.0851    0.8511
RX
RX  = -0.1667   -0.4217   -0.9638
     -0.2667   -0.9913   -2.1106
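
As a check on the conditional probability step, any entry of PHX can be recovered directly from Bayes' rule using the data of Example 1; a brief sketch for the (1,1) entry:

% Check one entry of PHX by Bayes' rule:  P(H = u_1|X = x_1)
PH  = [0.3 0.2 0.5];
PXH = [0.7 0.2 0.1; 0.2 0.6 0.2; 0.1 0.1 0.8];
PX  = PH*PXH;                 % [0.3000  0.2300  0.4700], as above
PXH(1,1)*PH(1)/PX(1)          % = 0.7*0.3/0.3 = 0.7000, agreeing with PHX(1,1)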

Example 2: RH given

% file dec2.m  
% Data for type in which RH is given
type = 2;
A = [1 2];
X = [-1 1 3];
PH = [0.2 0.5 0.3];
PXH = [0.5 0.4 0.1;   % PXH(i,j) = P(X = j|H = i)
      0.4 0.5 0.1;
      0.2 0.4 0.4];
RH = [-10   5 -12;
       5 -10  -5];    % r(a,i) = expected loss when
                      %   action is a, given H = i
 
dec2
dec
Decision process with experimentation
------------------- Instruction lines edited out
Enter type number  type
Enter vector A of actions A
Enter vector PH of parameter probabilities  PH
Enter matrix PXH of conditional probabilities  PXH
Enter vector X of test random variable values  X
Enter matrix RH of expected loss, given H  RH
 
 Optimum losses and actions
 Test value  Action     Loss
  -1.0000    2.0000   -5.0000
   1.0000    2.0000   -6.0000
   3.0000    1.0000   -7.3158
 
Bayesian risk  B(d*) = -5.89
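
The Bayesian risk displayed above can be checked directly from the data; a brief sketch, using the optimum losses RD from the run:

% Check of the Bayesian risk for Example 2
PH  = [0.2 0.5 0.3];
PXH = [0.5 0.4 0.1; 0.4 0.5 0.1; 0.2 0.4 0.4];
PX  = PH*PXH;                 % [0.3600  0.4500  0.1900]
RD  = [-5 -6 -7.3158];        % optimum losses from the table above
BD  = RD*PX'                  % = -5.89, as displayed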

Example 3: Carnival problem (type in which Y = H)

A carnival is scheduled to appear on a given date.  Profits to be earned depend heavily on the weather.  If rainy, the carnival loses $15 (thousands); if cloudy, the loss is $5 (thousands); if sunny, a profit of $10 (thousands) is expected.  If the carnival sets up its equipment, it must give the show;  if it decides not to set up, it forfeits $1,000.  For an additional cost of $1,000, it can delay setup until the day before the show and get the latest weather report.

Actual weather H = Y is 1 (rainy), 2 (cloudy), or 3 (sunny).

The weather report X has values 1, 2, or 3, corresponding to predictions rainy, cloudy, or sunny respectively.

Reliability of the forecast is expressed in terms of P(X = j|H = i); see matrix PXH.

Two actions: 1 (set up); 2 (no setup).

      Possible losses for each action and weather condition are in matrix L.

% file dec3.m
% Carnival problem
type = 3;             % Y = H  (actual weather)
A = [1  2];           % 1: setup  2: no setup
X = [1  2  3];        % 1: rainy,  2: cloudy, 3: sunny
L = [16 6 -9;         % L(a,k) = loss when action number is a, outcome is k
     2 2  2];         % --with premium for postponing setup
PH = 0.1*[1 3 6];     % P(H = i)
PXH = 0.1*[7 2 1;     % PXH(i,j) = P(X = j|H = i)
          2 6 2;
          1 2 7];
 
dec3
dec
Decision process with experimentation
------------------- Instruction lines edited out
Enter type number  type
Enter vector A of actions A
Enter vector PH of parameter probabilities  PH
Enter matrix PXH of conditional probabilities  PXH
Enter vector X of test random variable values  X
Enter loss matrix L  L
 
 Optimum losses and actions
 Test value  Action     Loss
   1.0000    2.0000    2.0000
   2.0000    1.0000    1.0000
   3.0000    1.0000   -6.6531
 
Bayesian risk  B(d*) = -2.56
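Reading the table above, the optimum rule is to set up unless the report predicts rain (X = 1), in which case the carnival does not set up. Since losses are in thousands of dollars, the Bayesian risk of -2.56 corresponds to an expected net gain of about $2,560 under this rule.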
