Long-term strategic planning and decision-making are crucial in any organization. However, we do not live in a world with a crystal ball giving us perfect information about the future. Risk and uncertainty therefore play an important role in the planning and decision-making process.
In ACCA Advanced Performance Management, you need to understand the impact of risk and uncertainty on strategic planning and decision-making. Exam questions frequently ask students to explain decision-making under different conditions. This article first introduces decision theory and then examines decision-making under different states in detail.
Introduction to Decision-Making
Business managers make organizational decisions under a variety of conditions and scenarios. Decision-making rests on the information available, the choices open, and the outcomes of each decision. The process begins with evaluating choices and preferences and analyzing the probable outcomes under a specific condition.
Broadly, the decision-making process involves conditions relating to information, preferences, and alternatives. The process can be rational or irrational, and it can be classified into several types depending on the decision-maker's underlying approach. Decision-making depends broadly on factors such as:
The explicit knowledge or information available for the decision-maker
Decision-maker’s cognitive behavior
The conditions under which the decision is being made, the amount of certainty and risks attached to the decision
The decision-making approach of the decision-maker
Decision-making is an approach to problem solving. Numerous decision-making studies and approaches define the effects and results of decision choices.
Business managers (the decision-makers) often face uncertain conditions when evaluating options and alternatives. In its simplest definition, decision theory is the process of making decisions under uncertainty by evaluating different choices.
The psychological study of decision-making also considers preferences and options. A decision-maker's preferences can be rational or irrational, and choices can be made under perfect or imperfect information. Decision theory argues that the impact of uncertainty can be reduced through repeated experiments, that is, through the use of probability.
Decision theory classifies the decision process under three conditions:
Decisions with Certainty: The decision-maker can evaluate different options with certain outcomes, because the information needed to decide is available. These decisions lead to a rational, measurable, and accurate decision-making process.
Decisions with Uncertainty: The decision-maker does not have access to accurate and measurable information, so the outcomes of the different options are unclear or uncertain. The decision-maker uses expected value or probability methods to evaluate the best option.
Decision-making with Risks: The decision-maker analyzes the potential outcomes of different choices under certain or uncertain conditions. The situation often arises from a lack of perfect or measurable information.
The decision theory describes the decision-making framework under two models of decision-making.
The Normative Decision-making
This decision-making model describes the ideal set of rules for the decision-maker: decisions should be based on rational, fully informed preferences. This theoretical approach aims to find the perfect decision-making method by evaluating certain and perfect decision options with known outcomes.
The Descriptive Decision-making
This model describes decision-making under the rules and factors that actually affect the decision-maker. In the practical world, many outside factors affect the decision-making process, and not all of them are internal or controllable. The model links to the probability of events occurring in real terms rather than in a purely theoretical framework.
The Expected Value Model
Decision-making under uncertainty cannot be measured in absolute, quantifiable terms. The expected value model weighs the average outcome of a decision process over repeated decisions with changing outcomes.
The model relies on weighted averages of the outcome variables and on the degree of repetition, or trial and error. Statistically, if we denote each outcome by X and the probability of that outcome by p, the formula for the expected value EV becomes:
EV = ∑ pX
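As a minimal sketch, the formula can be expressed as a short Python function (the outcome figures below are purely illustrative, not from the article's examples):

```python
def expected_value(outcomes):
    """EV = sum of probability * payoff over all possible outcomes."""
    return sum(p * x for p, x in outcomes)

# Illustrative (probability, payoff) pairs; probabilities sum to 1.0
ev = expected_value([(0.5, 10), (0.25, 20), (0.25, -4)])
print(ev)  # 0.5*10 + 0.25*20 + 0.25*(-4) = 9.0
```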
The expected value model can be used to evaluate decision outcomes with perfect and imperfect information.
The Perfect Information
When all possible outcomes of a decision are known with 100% certainty, it is called perfect information. Perfect information makes predictions of future outcomes correct every time. In the practical world, perfect information is usually obtained by acquiring additional information, such as a forecast.
The Imperfect Information
When some of the inputs or variables are unknown, making the outcome of the decision-making process uncertain, it is called imperfect information. Future predictions can then be made with some precision, but not with certainty.
In both cases, the value of information is calculated as the difference between the expected value with the information and the expected value without it.
Expected Value Model under Perfect Information Scenario
Expected values are the weighted averages of the probable outcomes. Even perfect information about the probabilities does not remove the uncertainty of any single outcome. A simple example of expected value with known probabilities is the roll of a die: three faces are even and three are odd, so the probability of an even (or odd) result is 50%. If an even number wins $10 and an odd number loses $5, the expected value is:
EV = 50% × (10) + 50% × (-5) = 5 – 2.50 = $ 2.50
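The dice working, and the idea of repeated trials, can be sketched in Python (the simulation size and seed are arbitrary choices):

```python
import random

# Even faces (2, 4, 6) win $10; odd faces (1, 3, 5) lose $5
ev = 0.5 * 10 + 0.5 * (-5)
print(ev)  # 2.5

# Repeated trials: the mean payoff over many rolls converges towards the EV
random.seed(1)
payoffs = [10 if random.randint(1, 6) % 2 == 0 else -5 for _ in range(100_000)]
print(sum(payoffs) / len(payoffs))  # close to 2.5
```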
This is a simple example; decision-making under expected value requires repeated trials under the same information, with the mean of the outcomes informing the final decision.
The value of information is simply the difference between the outcome or expected value of a decision with and without the information.
Suppose an ice cream seller predicts the weather with three probabilities and decides the optimum customer order size based on expected values. We have the following information (payoffs in $):
Weather (probability) – Small orders / Medium orders / Large orders
Cold (0.2) – 250 / 200 / 100
Warm (0.5) – 200 / 500 / 300
Hot (0.3) – 150 / 300 / 750
For each type of orders, the expected values can be calculated as:
Expected Value of Small orders = (0.2 × 250) + (0.5 × 200) + (0.3×150) = $ 195
Expected Value of Medium orders= (0.2 × 200) + (0.5 × 500) + (0.3×300) = $ 380
Expected Value of Large orders = (0.2 × 100) + (0.5 × 300) + (0.3×750) = $ 395
The expected values suggest going with large orders, at $395. To find the value of perfect information, first calculate the expected value when the seller knows the weather in advance and matches the order size to it:
Expected Value = (0.2 × 250) + (0.5×500) + (0.3×750) = $ 525
For each weather state, we multiplied the probability by the maximum possible payoff to get the maximum expected value. The value of information thus becomes:
Value of Perfect Information= $ 525 - $ 395 = $ 130
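The perfect-information working above can be reproduced in a short Python sketch (the dictionary names are illustrative; the payoffs are those assumed in the example):

```python
# Payoff table: order size -> payoff ($) under each weather state
payoffs = {
    "small":  {"cold": 250, "warm": 200, "hot": 150},
    "medium": {"cold": 200, "warm": 500, "hot": 300},
    "large":  {"cold": 100, "warm": 300, "hot": 750},
}
prob = {"cold": 0.2, "warm": 0.5, "hot": 0.3}

# Expected value of committing to one order size without extra information
evs = {size: sum(prob[w] * pay[w] for w in prob) for size, pay in payoffs.items()}
print(evs)  # small ≈ 195, medium ≈ 380, large ≈ 395 -> choose large orders

# With perfect information, the seller picks the best order for each weather state
ev_perfect = sum(prob[w] * max(pay[w] for pay in payoffs.values()) for w in prob)
print(ev_perfect)                      # ≈ 525
print(ev_perfect - max(evs.values()))  # value of perfect information ≈ 130
```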
Expected Value Model under Imperfect Information Scenario
With imperfect information, the outcome of a decision cannot be forecast with 100% accuracy, and the probabilities attached to an event may or may not add up to 100%. Imperfect information does not (and cannot) eliminate the uncertainty; rather, it changes the degree of uncertainty in the outcome.
Theoretically, the value of imperfect information should be less than the value of perfect information as it is less reliable. The use of past data or additional information can be helpful in decision-making under imperfect information. The value of decision-making with information (perfect or imperfect) will always be greater than the value without information.
Continuing with our example above, suppose we now have imperfect information: the forecast is for hot weather, and the forecast's track record against the actual weather is:
Cold (0.3) – forecast hot but weather actually cold
Warm (0.4) – forecast hot but weather actually warm
Hot (0.7) – forecast hot and weather actually hot
These figures are the probabilities of a "hot" forecast given each actual weather state, so they do not add up to 1.0; the information is imperfect.
We can assess these probabilities against a 100-day period. Our past (perfect) information expects 20 cold days, 50 warm days, and 30 hot days, matching the original probabilities:
Weather probability = Cold (0.2), Warm (0.5), Hot (0.3) = 1.0 total probability
From these, we can calculate the revised (imperfect) probabilities.
If the imperfect information predicts hot weather, a "hot" forecast occurs on (20 × 0.3) + (50 × 0.4) + (30 × 0.7) = 6 + 20 + 21 = 47 of the 100 days, and the revised probabilities for the three weather states become:
Probability of Cold = 6/47 = 0.128
Probability of Warm = 20/47 = 0.425
Probability of Hot = 21/47 = 0.447
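This revision of probabilities is an application of Bayes' rule, which can be sketched as follows (variable names are illustrative):

```python
# Prior weather probabilities from past (perfect) information
prior = {"cold": 0.2, "warm": 0.5, "hot": 0.3}
# Forecast reliability: P(forecast says "hot" | actual weather)
likelihood = {"cold": 0.3, "warm": 0.4, "hot": 0.7}

# Bayes' rule: P(weather | forecast hot) = prior * likelihood / P(forecast hot)
p_forecast_hot = sum(prior[w] * likelihood[w] for w in prior)  # 0.47
posterior = {w: prior[w] * likelihood[w] / p_forecast_hot for w in prior}
print({w: round(p, 3) for w, p in posterior.items()})
# cold ≈ 0.128, warm ≈ 0.426 (20/47, shown as 0.425 in the working), hot ≈ 0.447
```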
The expected values for the selling orders would become:
Expected Value of Small orders = (0.128 × 250) + (0.425 × 200) + (0.447×150) = $ 184
Expected Value of Medium orders= (0.128 × 200) + (0.425 × 500) + (0.447×300) = $ 372
Expected Value of Large orders = (0.128 × 100) + (0.425 × 300) + (0.447×750) = $ 476
The optimum with imperfect information is again large orders, with an expected value of $476.
Value of imperfect information = $476 − $395 = $81
As expected, the value of imperfect information ($81) is less than the value of perfect information ($130).
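The imperfect-information working can likewise be checked in Python (reusing the payoff table assumed in the earlier example):

```python
# Payoff table: order size -> payoff ($) under each weather state
payoffs = {
    "small":  {"cold": 250, "warm": 200, "hot": 150},
    "medium": {"cold": 200, "warm": 500, "hot": 300},
    "large":  {"cold": 100, "warm": 300, "hot": 750},
}
# Revised probabilities after the imperfect "hot" forecast
posterior = {"cold": 6 / 47, "warm": 20 / 47, "hot": 21 / 47}

evs = {size: sum(posterior[w] * pay[w] for w in posterior)
       for size, pay in payoffs.items()}
print({size: round(ev) for size, ev in evs.items()})
# small ≈ 184, medium ≈ 372, large ≈ 476 -> still choose large orders

# Value of imperfect information vs the best prior decision ($395 for large)
print(round(max(evs.values()) - 395))  # ≈ 81
```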
Limitations of the Expected Value model in Decision-making
The expected value model can be usefully applied under both perfect and imperfect information. If probabilities can be assigned with a degree of confidence, it can be used to calculate the expected value of each outcome and the value of information.
However, because of its generalizations, the expected value model has some limitations for decision-makers:
It ignores the risk factors associated with decisions under different scenarios
It relies on repeated, trial-and-error averaging of outcomes to assign expected values, which may still not provide accurate information
The calculated expected values may still differ from actual outcomes
Imperfect information still needs to be adjusted with past data or probabilities
Having introduced decision theory, we then elaborated on decision-making under different scenarios, namely perfect and imperfect information, using expected values.
Decision-making appears often in the ACCA APM exam, so you should prepare well for it. In the July 2020 exam, a question asked whether an approach taken for a "one-off" decision would be appropriate for longer-term decision-making. Performance was disappointing, as many students lacked technical knowledge of decision-making under different conditions and states. The concepts of expected value and of perfect and imperfect information are required before attempting a question like this.
Decision-making will remain an important part of the ACCA APM exam. Be prepared for it.
If you find this article helpful and want to help others too, just share it on any social media (such as Facebook or LinkedIn).