• Buy (now) a fully-flexible ticket: this would generally be more expensive than a non-flexible ticket, but trip-cancellation insurance would not be required, and the cost would be fixed from the outset.
• Delay the purchase of the ticket until the dates are known with more certainty; at that point, make a final decision as to whether to buy a fixed-date ticket or a flexible one, possibly also with trip-cancellation insurance.
• One could think of an even wider set of decision options of a more structural nature, fundamentally different to the originally planned actions, but which nevertheless aim to achieve the desired objectives; for example, one may conduct a series of video conferences coupled with electronic document sharing, instead of having an in-person meeting.
When planning a major building or renovation project (for example, of an old apartment that one has just bought), one may estimate a base budget for the works and then add some contingency to cover “unexpected” issues: for example, materials or labour costs may be higher than expected, asbestos may be discovered in currently hidden (or inaccessible) wall or ceiling cavities, or supporting structures may not be as solid as expected, and so on. This process results in a revised figure that may be sufficient to cover the total project costs even when several of the risks materialise (a simple numerical sketch of such a calculation follows the list below). If this revised budget is covered by available funds, one would presumably proceed with the project as originally conceived. However, if this revised budget exceeds the funds available, one may have to develop further decision options, such as:
• To continue the project as originally planned and “hope for the best” (whilst potentially looking for other possible mitigation measures, such as borrowing money from a family member if required, and taking in a lodger to repay the borrowings more quickly).
• To re-scope the project (e.g. use less expensive finishes).
• To restructure the project into phases (e.g. delay for several years the renovation of the spare bathroom until more funds are available).
• To cancel the project entirely.
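As an illustration of the contingency-budgeting logic above, the following is a minimal sketch using purely hypothetical figures (none are taken from the text): a base budget, a handful of risk items each with an assumed probability and cost, an expected-value contingency, and a simple simulation of how often even the revised budget would be exceeded if several risks materialise together.

```python
import random

# Hypothetical base estimate for the renovation works (illustrative only)
base_budget = 50_000

# Risk items as (probability of occurring, cost if it occurs) -- assumed values
risks = {
    "materials/labour cost overrun": (0.40, 8_000),
    "asbestos found in cavities":    (0.15, 12_000),
    "weak supporting structures":    (0.10, 20_000),
}

# A simple expected-value contingency: sum of probability-weighted costs
expected_contingency = sum(p * c for p, c in risks.values())
revised_budget = base_budget + expected_contingency

def simulate_total(n=100_000):
    """Simulate total project cost by randomly 'materialising' each risk."""
    totals = []
    for _ in range(n):
        total = base_budget + sum(c for p, c in risks.values() if random.random() < p)
        totals.append(total)
    return totals

totals = simulate_total()
exceed_share = sum(t > revised_budget for t in totals) / len(totals)

print(f"Revised budget (base + expected contingency): {revised_budget:,.0f}")
print(f"Share of scenarios in which it is exceeded:   {exceed_share:.0%}")
```

Even this toy calculation makes the point in the text explicit: a budget set at the base plus the expected contingency will still be exceeded in a noticeable share of scenarios, because several risks can materialise together.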
When planning to travel from home to the airport, if one has already made the journey many times, one would know from experience how much travel time to allow: this “base case” plan would already implicitly allow, to some extent, for unforeseen events that may materialise en route. In other words, the base plan would have some contingency (time) built in. On the other hand, where the journey is new (e.g. one has recently moved into the area), one may do some explicit research to estimate the base journey time, and then perhaps add some extra contingency time as well.
When planning a journey that will be undertaken with another person, each person's desired contingency time will typically differ: each has a different tolerance for risk, a different perceived cost of excess waiting time (e.g. at the airport), and a different view of the consequences of missing the plane.
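To make this trade-off concrete, the following is a minimal sketch in which the journey time, the delay distribution and the two travellers' cost assumptions are all purely illustrative: each traveller chooses the contingency buffer that minimises their own expected cost of waiting at the airport versus missing the plane.

```python
import random

random.seed(1)

BASE_TIME = 60     # assumed typical journey time to the airport, in minutes
MEAN_DELAY = 15    # assumed average additional en-route delay, in minutes

def expected_cost(buffer, wait_cost_per_min, miss_cost, trials=20_000):
    """Expected cost of leaving (BASE_TIME + buffer) minutes before the gate closes."""
    total = 0.0
    for _ in range(trials):
        delay = random.expovariate(1 / MEAN_DELAY)   # random en-route delay
        slack = buffer - delay                        # minutes to spare at the airport
        if slack >= 0:
            total += slack * wait_cost_per_min        # cost of excess waiting time
        else:
            total += miss_cost                        # plane missed
    return total / trials

# Two travellers with different (assumed) attitudes: one values their time highly
# and sees missing the flight as merely annoying; the other is the reverse.
travellers = {
    "relaxed about missing it": (2.0, 300),    # (cost per min waiting, cost of missing)
    "must not miss it":         (0.5, 3000),
}

for name, (wait_cost, miss_cost) in travellers.items():
    best = min(range(0, 121, 5),
               key=lambda b: expected_cost(b, wait_cost, miss_cost))
    print(f"{name}: best contingency is about {best} minutes")
```

Under these assumptions the first traveller would choose a buffer of roughly half an hour, the second closer to an hour and a half, illustrating why two people planning the same journey rarely agree on how early to leave.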
Of course, in general, these informal processes can be very valuable; indeed, they may often be sufficient to ensure that an adequate decision is taken. In other cases, they will be insufficient.
Clearly, in both the public and private sectors there have been many projects in which significant unexpected delays or cost overruns occurred, especially in the delivery of major infrastructure, transportation and construction projects. An example (chosen only because it appeared in the general press around the time of writing) is the project to deliver a tramway in Edinburgh (Scotland), which was due to cost around £400 million when announced in 2003, but had risen to around £800 million by the time of project completion in 2014.
In fact, it is probably fair to say that most failures (and many successes) of risk management in business contexts are not publicly observable, for many reasons, including:
• They are of a size that does not impact the aggregate business performance in a meaningful way (even if the amounts concerned may be substantial by the standards of ordinary individuals), and the losses are absorbed within a general budget.
• They are not openly discussed, and the failure is not objectively investigated (nor the results made public).
• It is challenging to demonstrate that risks that did materialise could and should have been mitigated earlier: in other words to distinguish the “benefits of hindsight” from what should reasonably have been known earlier in the process.
However, occasionally there have been major cases that have been of sufficient size and public importance that their causes have been investigated in detail; some of these are briefly discussed below:
• The Financial Crisis. The financial crisis of the early 21st century led to the creation of the Financial Crisis Inquiry Commission, whose role was to establish the causes of the crisis in the United States. Although its report, published in January 2011, runs to hundreds of pages, some key conclusions were:
• “… this financial crisis was avoidable … the result of human action and inaction, not of Mother Nature or computer models gone haywire. The captains of finance and the public stewards of our financial system ignored warnings, and failed to question, understand, and manage evolving risks.”
• “Despite the view of many … that the crisis could not have been foreseen … there were warning signs. The tragedy was that they were ignored or discounted.”
• “Dramatic failures of corporate governance and risk management at many systemically important financial institutions were a key cause of this crisis …”
• The Deepwater Horizon Oil Spill. In April 2010, the Macondo oil well being drilled in the Gulf of Mexico suffered a severe blowout, costing the lives of 11 men and resulting in the spillage of millions of barrels of crude oil. This disrupted the region's economy, damaged fisheries and habitats, and led to BP having to pay large sums in compensation and damages. A commission was set up by President Obama to investigate the disaster, its causes and effects, and to recommend the actions necessary to minimise such risks in the future. The Report to the President, issued in January 2011, runs to several hundred pages. Some key conclusions include:
• “The loss … could have been prevented.”
• “The immediate causes … a series of identifiable mistakes … that reveal … systematic failures in risk management.”
• “None of [the] decisions … in Figure 4.10 [Examples of Decisions that Increased Risk at Macondo while Potentially Saving Time] appear to have been subject to a comprehensive and systematic risk-analysis, peer-review, or management of change process.”
• Columbia Space Shuttle. On 1 February 2003, the space shuttle Columbia broke up as it returned to Earth, killing the seven astronauts on board. The Accident Investigation Board reported in August 2003, showing that a large piece of foam had broken off the shuttle's external tank during launch and struck the leading edge of the wing, breaching it; the damaged wing then failed under the heat of re-entry. The report also noted that:
• The problem … was well known and had caused damage on prior flights; management considered it an acceptable risk.
• “… the accident was probably not an anomalous, random event, but rather likely rooted … in NASA's history and … culture.”
• “Cultural traits and organizational practices detrimental to safety were allowed to develop, including … a reliance on past success as a substitute for sound engineering … [and] … organizational barriers that prevented effective communication and stifled professional differences of opinion.”
1.2 General Challenges in Decision-Making Processes
This section covers some of the general or contextual challenges in decision-making processes, including that of achieving an appropriate balance between rational considerations and intuition, as well as the possibility of the