Application Center - Maplesoft

Decision Analysis using Bayes Rule 

The process of revising prior probabilities into posterior probabilities (prior-->posterior) to incorporate sample information is known as Bayes' rule.
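As an illustration of the prior-->posterior revision, here is a short Python sketch (the original Maple input is not shown in this preview). The numbers are hypothetical: two states of nature with priors 0.6 and 0.4, and a survey result F with likelihoods P(F|s1) = 0.8 and P(F|s2) = 0.3.

```python
# Bayes' rule: posterior = prior * likelihood / marginal (hypothetical numbers)
prior = [0.6, 0.4]                 # P(s1), P(s2)
likelihood = [0.8, 0.3]            # P(F | s1), P(F | s2)
joint = [p * l for p, l in zip(prior, likelihood)]   # P(F, s)
marginal = sum(joint)                                # P(F)
posterior = [j / marginal for j in joint]            # P(s | F), approx [0.8, 0.2]
```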

 

Initialize 

If you use the "execute the entire worksheet" button (!!!) shown above, the worksheet will run faster.

 

Bayes 

 

 

> p1;
 

Sample Size (1)
 

> p2;
 

(2)
 

> p3;
 

(3)
 

> p4;
 

(4)
 

> p5;
 

EPPI (5)
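EPPI (Expected Payoff with Perfect Information) weights the best payoff in each state by that state's prior probability. A Python sketch with a hypothetical payoff table (rows = decisions, columns = states):

```python
# EPPI: for each state take the best payoff over all decisions,
# then weight by the prior of that state (hypothetical numbers)
payoff = [[100, -20],   # decision d1
          [ 30,  30]]   # decision d2
prior = [0.6, 0.4]
best_per_state = [max(row[j] for row in payoff) for j in range(len(prior))]
eppi = sum(b * p for b, p in zip(best_per_state, prior))   # 0.6*100 + 0.4*30 = 72
```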
 

> p6;p7;p21;p22;
 

 

 

 

EPNS
EPNS is the maximum expected payoff before sampling.
EPNS: which row is the maximum?
Choose the payoff in this row as the best option before sampling. (6)
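A Python sketch of the EPNS step, using the same hypothetical payoff table and priors:

```python
# EPNS: expected payoff of each decision under the priors; take the maximum row
payoff = [[100, -20],   # decision d1
          [ 30,  30]]   # decision d2
prior = [0.6, 0.4]
expected = [sum(p * pr for p, pr in zip(row, prior)) for row in payoff]
epns = max(expected)             # best expected payoff before sampling
best_row = expected.index(epns)  # the row (decision) to choose
```

Here EP(d1) = 0.6·100 + 0.4·(-20) = 52 and EP(d2) = 30, so d1 is the best option before sampling.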
 

> p8;p23;
 

 

EVPI
EVPI is the maximum one should be willing to pay for perfect sampling information.  Rarely is perfect information available. (7)
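EVPI is the difference between the expected payoff with perfect information and the best expected payoff without sampling. A sketch with the same hypothetical numbers:

```python
# EVPI = EPPI - EPNS: upper bound on what sampling information can be worth
payoff = [[100, -20], [30, 30]]
prior = [0.6, 0.4]
epns = max(sum(p * pr for p, pr in zip(row, prior)) for row in payoff)        # 52
eppi = sum(max(row[j] for row in payoff) * prior[j] for j in range(len(prior)))  # 72
evpi = eppi - epns   # 20: never pay more than this for any information
```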
 

> p9;p29;
 

 

SPI (marginal probabilities)
SPI is the sum of each row of the joint probabilities. (8)
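A sketch of the SPI step, using a hypothetical joint probability table (rows = survey outcomes, columns = states):

```python
# SPI (marginal probabilities): sum each row of the joint table
joint = [[0.48, 0.12],   # favorable survey result
         [0.12, 0.28]]   # unfavorable survey result
spi = [sum(row) for row in joint]   # P(favorable), P(unfavorable)
```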
 

> p10;
 

(9)
 

> p11;
 

(10)
 

> p12;p30;
 

 

Prior probabilities
Prior probabilities are the sums of the columns of the joint probabilities. (11)
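A sketch of recovering the priors as column sums, using the same hypothetical joint table:

```python
# Prior probabilities: sum each column of the joint table
joint = [[0.48, 0.12],   # favorable survey result
         [0.12, 0.28]]   # unfavorable survey result
prior = [sum(row[j] for row in joint) for j in range(len(joint[0]))]
```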
 

> p13;p24;p25;
 

 

 

Posterior probabilities
The posterior probabilities are the revised prior probabilities AFTER the sampling information is used in the calculations.
The process of revising prior (prior-->posterior) probabilities to include sample information is known as Bayes rule. (12)
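A sketch of the posterior table: each joint probability divided by its row marginal, with the same hypothetical numbers:

```python
# Posterior probabilities: joint probability / row marginal
joint = [[0.48, 0.12],   # favorable survey result
         [0.12, 0.28]]   # unfavorable survey result
marginal = [sum(row) for row in joint]
posterior = [[p / m for p in row] for row, m in zip(joint, marginal)]
# approx [[0.8, 0.2], [0.3, 0.7]]: each row of posteriors sums to 1
```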
 

> p14;p26;p15;
 

 

 

EP (Expected Payoff)
EP (Expected Payoff) is the sum of the payoff times the posterior probability.  Find the maximum in each row of EP.  The column in which the maximum is located identifies the payoff row (after sampling) one should choose.
The best payoff (13)
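A sketch of the EP table (rows = survey outcomes, columns = decisions), using the same hypothetical payoffs and posteriors:

```python
# EP: expected payoff of each decision under the posterior for each outcome
payoff = [[100, -20], [30, 30]]        # rows = decisions, columns = states
posterior = [[0.8, 0.2], [0.3, 0.7]]   # rows = survey outcomes
ep = [[sum(p * q for p, q in zip(drow, post)) for drow in payoff]
      for post in posterior]
best = [max(row) for row in ep]        # best payoff for each survey outcome
```

Here EP = [[76, 30], [16, 30]]: after a favorable result choose d1 (76), after an unfavorable result choose d2 (30).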
 

> p16;p27;
 

 

EPS
EPS is the maximum Expected Payoff after Sampling using posterior probabilities. (14)
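A sketch of the EPS step: weight the best expected payoff for each survey outcome by the marginal probability of that outcome (hypothetical values):

```python
# EPS: sum over outcomes of P(outcome) * best expected payoff given outcome
marginal = [0.6, 0.4]                # hypothetical P(favorable), P(unfavorable)
best_given_outcome = [76.0, 30.0]    # hypothetical best EP for each outcome
eps = sum(m * b for m, b in zip(marginal, best_given_outcome))  # 57.6
```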
 

> p17;p18;
 

 

EVSI
EVSI is the gain from sampling.  Be reluctant to let the cost of sampling and sample information exceed this figure. (15)
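A sketch of the EVSI step, using hypothetical EPS and EPNS values from the running example:

```python
# EVSI = EPS - EPNS: the expected value of the sample information
eps, epns = 57.6, 52.0   # hypothetical values
evsi = eps - epns        # 5.6: do not pay more than this for the sample
```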
 

> p20;p19;p28;
 

 

 

ENGS
ENGS (Expected Net Gain for Sampling = EVSI minus any sampling costs) is important.  As long as ENGS is positive, the company can expect to gain by obtaining the sample information.
Conclusion:  Since ENGS < 0, don't pay for sampling. (16)
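A sketch of the ENGS decision, with a hypothetical sampling cost chosen so that ENGS comes out negative, as in the worksheet's conclusion:

```python
# ENGS = EVSI - sampling cost; sample only if ENGS > 0
evsi = 5.6               # hypothetical, from the EVSI step
sampling_cost = 7.0      # hypothetical cost of obtaining the sample
engs = evsi - sampling_cost
worth_sampling = engs > 0   # False here: don't pay for sampling
```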
 

> p31;p32;
 

 

Efficiency
Efficiency (EVSI x 100 / EVPI, expressed as a percentage) measures how well the sampling procedure performs relative to perfect information. (17)
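A sketch of the efficiency calculation with the hypothetical EVSI and EVPI values from the running example:

```python
# Efficiency: percentage of the value of perfect information that sampling captures
evsi, evpi = 5.6, 20.0
efficiency = evsi / evpi * 100   # 28% of EVPI is captured by this sample
```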
 
