Weekly Summary 5.1

Assignment Name: Weekly Summary 5.1

Course Name and Number: Data Analytics CBSC 520

Abstract

Based on the text by S. Christian Albright and Wayne L. Winston, this chapter discusses probability essentials and the elements of decision analysis: identifying the problem, possible decisions and outcomes, decision trees, one-stage decision problems, the PrecisionTree add-in, multistage decision problems, the role of risk aversion, utility functions, and exponential utility.

Introduction

This chapter provides a formal framework for analyzing decision problems that involve uncertainty. Our discussion includes the following: criteria for choosing among alternative decisions; how probabilities are used in the decision-making process; how early decisions affect decisions made at a later stage; how a decision maker can quantify the value of information; and how attitudes toward risk can affect the analysis. Throughout, we employ a powerful graphical tool, a decision tree, to guide the analysis. A decision tree enables a decision maker to view all important aspects of the problem at once: the decision alternatives, the uncertain outcomes and their probabilities, the economic consequences, and the chronological order of events.

Elements of Decision Analysis

Although decision making under uncertainty occurs in a wide variety of contexts, the problems we discuss in this chapter are alike in the following ways:

1. A problem has been identified that requires a solution.

2. Several possible decisions have been identified.

3. Each decision leads to several possible outcomes.

4. There is uncertainty about which outcome will occur, and probabilities of the possible outcomes are assessed.

5. For each decision and each possible outcome, a payoff is received, or a cost is incurred.

6. A “best” decision must be chosen using an appropriate decision criterion.

Identifying the problem

When something triggers the need to solve a problem, the problem that needs to be solved should be carefully identified.

Possible decisions

The possible decisions depend on how the problem is specified.

Possible outcomes

One of the main reasons why decision making under uncertainty is difficult is that decisions must be made before uncertain outcomes are revealed.

Probabilities of outcomes

There is no easy way to assess the probabilities of the possible outcomes. Sometimes they will be determined at least partly by historical data. Other estimates will necessarily contain a heavy subjective component, such as when a new product is being introduced. To complicate matters, probabilities sometimes change as more information becomes available.

Payoffs and costs

Decisions and outcomes have consequences, either good or bad, and may be monetary or nonmonetary.

Decision Criterion

Several criteria are possible for choosing among decisions:

1. Look at the worst possible outcome for each decision and choose the decision that has the best (or least bad) of these.

2. Look at the 5th percentile of the distribution of outcomes for each decision and choose the decision that has the best of these.

3. Look at the best possible outcome for each decision and choose the decision that has the best of these.

4. Look at the variance of the distribution of outcomes for each decision and choose the decision that has the smallest of these.

5. Look at the downside risk of the distribution of outcomes for each decision and choose the decision with the smallest of these.

The expected monetary value, or EMV, for any decision is a weighted average of the possible payoffs for that decision, weighted by the probabilities of the outcomes. The expected monetary value criterion, or EMV criterion, is generally regarded as the preferred criterion in most decision problems. This approach assesses probabilities for each outcome of each decision and then calculates the expected payoff, or EMV, of each decision based on these probabilities.
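As a minimal sketch, the criteria above can be compared on a small payoff table. All payoffs and probabilities here are invented for illustration and are not from the text:

```python
# Hypothetical payoff table: each decision maps to its payoffs under
# three possible outcomes. Numbers are invented for illustration.
payoffs = {
    "small order": [10, 20, 30],
    "large order": [-5, 25, 60],
}
probs = [0.3, 0.5, 0.2]  # assessed probabilities of the three outcomes

def emv(payoff_row, probs):
    """EMV: probability-weighted average of the possible payoffs."""
    return sum(p * v for p, v in zip(probs, payoff_row))

# Worst-case criterion: best of the worst possible outcomes.
maximin_choice = max(payoffs, key=lambda d: min(payoffs[d]))

# EMV criterion: best probability-weighted payoff.
emv_choice = max(payoffs, key=lambda d: emv(payoffs[d], probs))

print(maximin_choice)  # "small order": its worst case (10) beats -5
print(emv_choice)      # "large order": EMV 23 beats EMV 19
```

Note how the two criteria disagree here: the worst-case criterion avoids the risky large order, while the EMV criterion prefers it because of its higher weighted-average payoff.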

More About the EMV Criterion

A decision with a given EMV is valued the same as a sure monetary outcome with the same EMV. The EMV criterion doesn't guarantee good outcomes. The EMV criterion is easy to operationalize in a spreadsheet. The advantage of calculating EMVs in a spreadsheet is that you can easily perform sensitivity analysis on any of the inputs.

Decision Trees

A decision problem evolves through time. A decision is made, then an uncertain outcome is observed, then another decision might need to be made, then another uncertain outcome might be observed, and so on. All the while, payoffs are being received or costs are being incurred. It is useful to show all these elements of the decision problem, including the timing, in a type of graph called a decision tree. A decision tree not only allows everyone involved to see the elements of the decision problem in an intuitive format, but it also provides a straightforward way of making the necessary EMV calculations.

Decision trees are composed of nodes (circles, squares, and triangles) and branches (lines). The nodes represent points in time. A decision node (a square) represents a time when you make a decision. A probability node (a circle) represents a time when the result of an uncertain outcome becomes known. An end node (a triangle) indicates that the problem is complete: all decisions have been made, all uncertainty has been resolved, and all payoffs and costs have been incurred. (When people draw decision trees by hand, they often omit the actual triangles. However, we still refer to the right-hand tips of the branches as the end nodes.)

Time proceeds from left to right. This means that any branches leading into a node (from the left) have already occurred, and any branches leading out of a node (to the right) have not yet occurred. Branches leading out of a decision node represent the possible decisions; you get to choose the preferred branch. Branches leading out of probability nodes represent the possible uncertain outcomes; you have no control over which of these will occur. Probabilities are listed on probability branches. These probabilities are conditional on the events that have already been observed (those to the left). Also, the probabilities on branches leading out of any probability node must sum to 1. Monetary values are shown to the right of the end nodes.

One-Stage Decision Problems

Many decision problems are similar to the simple decision problem discussed in the previous section. You decide, then you wait to see an uncertain outcome, and a payoff is received, or a cost is incurred. We refer to these as single-stage decision problems because you make only one decision, the one right now.
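The folding-back calculation that a decision tree supports can be sketched in a few lines of code: a probability node is replaced by its EMV, and a decision node by its best branch. The tree and its numbers below are invented for illustration:

```python
# A minimal fold-back sketch for a one-stage decision tree.
# End nodes hold payoffs; probability nodes average their branches by
# probability; decision nodes take the branch with the best EMV.

def fold_back(node):
    kind = node["kind"]
    if kind == "end":
        return node["payoff"]
    if kind == "prob":
        # Probabilities on branches out of a probability node sum to 1.
        return sum(p * fold_back(child) for p, child in node["branches"])
    if kind == "decision":
        # You get to choose the preferred branch, so take the max.
        return max(fold_back(child) for child in node["branches"])

# Hypothetical tree: do nothing (payoff 0) versus launch a product
# that succeeds with probability 0.4 (payoff 100) or fails (payoff -30).
tree = {
    "kind": "decision",
    "branches": [
        {"kind": "end", "payoff": 0},
        {"kind": "prob", "branches": [
            (0.4, {"kind": "end", "payoff": 100}),
            (0.6, {"kind": "end", "payoff": -30}),
        ]},
    ],
}
print(fold_back(tree))  # 22.0: launching (0.4*100 - 0.6*30 = 22) beats 0
```

This is the same right-to-left procedure that the PrecisionTree add-in performs automatically.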

The Precision Tree Add-In

Decision trees present a challenge for Excel. PrecisionTree, a powerful add-in developed by Palisade Corporation, makes the process relatively straightforward. It enables you to draw and label a decision tree, it performs the folding-back procedure automatically, and it allows you to perform sensitivity analysis on key input parameters. See your text for a detailed description of its use.

Multistage Decision Problems

Many real-world decision problems evolve through time in stages. A company first makes a decision. Then it observes an uncertain outcome that provides some information. Based on this information, the company makes another decision, and then observes another uncertain outcome. The objective is again to maximize EMV, but now we are searching for an EMV-maximizing strategy, often called a contingency plan, that specifies which decision to make at each stage. A contingency plan tells the company which decision to make at the first stage, but the company won't know which decision to make at the second stage until the information from the first uncertain outcome is known.

An important aspect of multistage decision problems is that probabilities can change through time. Specifically, after you receive the information from the first-stage uncertain outcome, you might need to reassess the probabilities of future uncertain outcomes.

Another important aspect of multistage decision problems is the value of information. Sometimes the first-stage decision is to buy information that will help in making the second-stage decision. The question then is how much this information is worth. In a decision-making context, information is usually bought to reduce the uncertainty about some outcome. The expected value of information, EVI, is the amount a firm would be willing to pay for the information and is given by the formula: EVI = EMV with (free) information − EMV without information.

Although the calculation of EVI is straightforward once the decision tree has been created, the decision tree itself requires many probability assessments and Bayes' rule calculations. Therefore, it is sometimes useful to ask how much any information could be worth, regardless of its form or accuracy. The result is called the expected value of perfect information, or EVPI, and is given by the equation: EVPI = EMV with (free) perfect information − EMV without information.
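The EVPI equation can be illustrated with a small sketch. The two decisions, two outcomes, and all numbers below are invented, not taken from the text's examples:

```python
# EVPI sketch: two decisions, two equally likely market outcomes.
# All payoffs and probabilities are hypothetical.
payoffs = {"invest": {"good": 80, "bad": -40}, "skip": {"good": 0, "bad": 0}}
probs = {"good": 0.5, "bad": 0.5}

# EMV without information: commit to the single best decision now.
emv_no_info = max(
    sum(probs[o] * payoffs[d][o] for o in probs) for d in payoffs
)

# EMV with free perfect information: learn the outcome first, then make
# the best decision for that outcome, weighted by the outcome probabilities.
emv_perfect = sum(
    probs[o] * max(payoffs[d][o] for d in payoffs) for o in probs
)

evpi = emv_perfect - emv_no_info
print(emv_no_info, emv_perfect, evpi)  # 20.0 40.0 20.0
```

Here perfect information is worth 20 because it lets the firm avoid investing exactly when the market turns out bad; no actual information source could be worth more than this.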

A strategy region graph shows how the EMV of each decision varies over a range of some input, for example, in the text's Acme example, whether the company hires a marketing firm. This type of chart is useful for seeing whether the optimal decision changes over the range of the input variable.

The Role of Risk Aversion

Rational decision makers are sometimes willing to violate the EMV maximization criterion when large amounts of money are at stake. These decision makers are willing to sacrifice some EMV to reduce risk. Most researchers believe that if certain basic behavioral assumptions hold, people are expected utility maximizers; that is, they choose the alternative with the largest expected utility.

Utility Functions

A utility function is a mathematical function that transforms monetary values (payoffs and costs) into utility values. An individual's utility function specifies the individual's preferences for various monetary payoffs and costs and, in doing so, automatically encodes the individual's attitudes toward risk. Most individuals are risk averse, which means intuitively that they are willing to sacrifice some EMV to avoid risky gambles. There are two aspects of implementing expected utility maximization in a real decision analysis. First, an individual's (or company's) utility function must be assessed. Second, the resulting utility function is used to find the best decision.

Exponential Utility

An exponential utility function has only one adjustable numerical parameter, called the risk tolerance. There are straightforward ways to discover an appropriate value of this parameter for an individual or company, so it is relatively easy to assess. An exponential utility function has the following form:

U(x) = 1 − e^(−x/R)

The risk tolerance for an exponential utility function is a single number that specifies an individual’s aversion to risk. The higher the risk tolerance, the less risk averse the individual is.
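A short sketch can make the role of the risk tolerance R concrete. One useful derived quantity is the certainty equivalent: the sure amount whose utility equals the gamble's expected utility, found by inverting U(x) = 1 − e^(−x/R). The 50/50 gamble below is invented for illustration:

```python
import math

def exp_utility(x, R):
    """Exponential utility with risk tolerance R: U(x) = 1 - e^(-x/R)."""
    return 1.0 - math.exp(-x / R)

def certainty_equivalent(payoffs, probs, R):
    """Sure amount with the same utility as the gamble's expected utility."""
    eu = sum(p * exp_utility(x, R) for p, x in zip(probs, payoffs))
    return -R * math.log(1.0 - eu)  # invert U to recover a dollar amount

# Hypothetical 50/50 gamble between $0 and $100; its EMV is 50.
payoffs, probs = [0, 100], [0.5, 0.5]
for R in (50, 200, 1000):
    print(R, round(certainty_equivalent(payoffs, probs, R), 2))
```

The certainty equivalent stays below the EMV of 50 (the decision maker would accept less than 50 for sure to avoid the gamble) and rises toward 50 as R grows, illustrating that a higher risk tolerance means less risk aversion.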

Is Expected Utility Maximization Used?

Expected utility maximization is an involved task. Theoretically, it might be interesting to researchers. However, in the business world, it is not used very often. Risk aversion has been found to be of practical concern in only 5% to 10% of business decision analyses. It is often adequate to use expected value (EMV) for most decisions.

References

Albright, S. C., & Winston, W. L. (2017). Business Analytics: Data Analysis and Decision Making. Cengage Learning. ISBN 978-1-305-94754-2.
