5.21 Perform Quantitative Risk Analysis


Purpose & When to Use

  • Estimate the likelihood of meeting key dates and budgets using numbers, not just ratings.
  • Size cost and schedule contingency based on modeled uncertainty.
  • Identify which risks and assumptions drive the most variance.
  • Compare alternatives or response strategies using expected value and scenarios.
  • Use after qualitative risk prioritization, and on projects that are large, complex, or have tight targets or high uncertainty.
  • Repeat at major milestones or after significant scope, estimate, or risk changes.
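Comparing response strategies by expected monetary value (EMV), as the list above mentions, can be sketched in a few lines. This is a hypothetical illustration; all figures are invented for the example.

```python
# Hypothetical EMV comparison of two risk response strategies.
# All dollar figures are illustrative, not from any real project.

def emv(outcomes):
    """EMV = sum over outcomes of (probability x monetary impact)."""
    return sum(p * impact for p, impact in outcomes)

# Strategy A: mitigate -- certain cost of 15k, residual 10% chance of a 100k loss
strategy_a = emv([(1.0, -15_000), (0.10, -100_000)])

# Strategy B: accept -- 30% chance of a 100k loss, no upfront cost
strategy_b = emv([(0.30, -100_000)])

print(f"Mitigate EMV: {strategy_a:,.0f}")   # -25,000
print(f"Accept EMV:   {strategy_b:,.0f}")   # -30,000
```

Here mitigation has the less negative EMV, so it would be preferred on expected value alone, though risk appetite and non-monetary factors also matter.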

Mini Flow (How It’s Done)

  • Confirm inputs and data quality: risk register with well-defined risks, schedule network, cost estimates, assumptions, and constraints.
  • Build the model: add uncertainty to durations and costs (e.g., three-point estimates) and include discrete risk events with probabilities and impacts.
  • Select distributions and dependencies: choose suitable probability distributions and model correlations where variables move together.
  • Run analysis: use simulation for cost and/or schedule, perform sensitivity checks (e.g., tornado charts), and use decision trees or expected value where choices exist.
  • Interpret results: review percentiles (e.g., P50, P80), ranges, probability of meeting targets, and the top risk drivers.
  • Decide and update: propose contingency reserves, refine risk responses, adjust targets if needed, and update the risk report and risk register.
  • Communicate and iterate: explain assumptions and confidence levels, then rerun when plans or risks change.
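The flow above can be sketched as a minimal Monte Carlo cost model: three-point estimates become triangular distributions, discrete risks fire with a given probability, and percentiles are read from the sorted results. The work packages, risk probabilities, and budget below are all assumed for illustration.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Hypothetical three-point cost estimates per work package:
# (optimistic, most likely, pessimistic), in $k.
work_packages = [(80, 100, 140), (40, 50, 75), (20, 30, 55)]

# Hypothetical discrete risk events: (probability, cost impact in $k)
risk_events = [(0.30, 40), (0.10, 90)]

def one_iteration():
    # Sample each estimate from a triangular distribution...
    cost = sum(random.triangular(lo, hi, ml) for lo, ml, hi in work_packages)
    # ...then add each risk's impact if it occurs on this iteration.
    cost += sum(impact for p, impact in risk_events if random.random() < p)
    return cost

results = sorted(one_iteration() for _ in range(10_000))
p50 = results[int(0.50 * len(results))]
p80 = results[int(0.80 * len(results))]

budget = 220  # assumed approved budget, $k
prob_within = sum(c <= budget for c in results) / len(results)

print(f"P50={p50:.0f}k  P80={p80:.0f}k  P(cost<=budget)={prob_within:.0%}")
```

A real analysis would use a dedicated tool, validated schedule logic, and modeled correlations, but the mechanics (sample, aggregate, read percentiles) are the same.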

Quality & Acceptance Checklist

  • Scope, schedule, and cost models reflect the current baseline and logic ties are valid.
  • Key threats and opportunities are modeled, with clear probabilities and impacts.
  • Uncertainty ranges for major estimates are justified and sourced.
  • Chosen distributions and correlations are reasonable and documented.
  • Outputs include percentiles, probability of meeting key targets, and a list of main risk drivers.
  • Recommended contingency is traceable to analysis results, not arbitrary padding.
  • Assumptions, data quality limits, and scenario choices are recorded.
  • Stakeholders reviewed results and understand confidence levels and implications.
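The checklist item on traceable contingency can be made concrete: the reserve is the gap between the deterministic base estimate and the percentile the organization chooses to fund to, not a flat percentage. The numbers below are assumed outputs of a prior simulation run.

```python
# Hypothetical sketch: deriving cost contingency from simulation
# percentiles rather than arbitrary padding. Figures are illustrative.

base_estimate = 196   # $k, deterministic sum of most-likely estimates
p50, p80 = 215, 252   # $k, percentiles taken from a simulation run

# Funding to the P80 confidence level:
contingency = p80 - base_estimate   # 56 ($k)

print(f"Contingency at P80: {contingency}k "
      f"({contingency / base_estimate:.0%} of base estimate)")
```

The resulting percentage falls out of the analysis; it is an output, not an input, which is what makes the reserve defensible to a sponsor.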

Common Mistakes & Exam Traps

  • Confusing qualitative with quantitative analysis; quantitative uses numeric models and probabilities, not just risk ratings.
  • Assuming it is required on every project; it is used only when the benefit justifies the effort and sufficient data are available.
  • Ignoring opportunities; include positive impacts as well as threats in the model.
  • Leaving out correlations; dependencies between variables can materially change ranges.
  • Using point estimates only; always include uncertainty ranges for key costs and durations.
  • Misreading percentiles; P80 is not a guarantee but a confidence level under stated assumptions.
  • Double counting reserves; contingency comes from analysis for known-unknowns, while management reserve is separate for unknown-unknowns.
  • Running simulations on a weak schedule; missing logic ties or unrealistic calendars produce misleading results.
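The trap about leaving out correlations is easy to demonstrate: when two cost items move together, the upper percentiles of their sum widen compared with treating them as independent. This is a toy illustration with assumed uniform costs.

```python
import random

random.seed(7)
N = 20_000

def p80(samples):
    """Read the 80th percentile from a list of simulated totals."""
    ordered = sorted(samples)
    return ordered[int(0.80 * len(ordered))]

# Two hypothetical cost items, each uniform between 90 and 110 ($k).

# Case 1: independent draws -- highs and lows tend to cancel out.
independent = [random.uniform(90, 110) + random.uniform(90, 110)
               for _ in range(N)]

# Case 2: perfectly correlated -- one draw drives both items,
# so extremes compound instead of cancelling.
correlated = []
for _ in range(N):
    u = random.uniform(90, 110)
    correlated.append(u + u)

print(f"P80 independent: {p80(independent):.1f}k")
print(f"P80 correlated:  {p80(correlated):.1f}k")
```

Both cases share the same mean, yet the correlated P80 is noticeably higher, which is exactly why omitting dependencies understates the contingency needed at a given confidence level.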

PMP Example Question

Your sponsor asks for the probability of finishing within the approved budget and a data-backed recommendation for cost contingency. You have already prioritized risks using likelihood and impact scores. What should you do next?

  1. Add a fixed percentage contingency based on organizational policy.
  2. Perform quantitative risk analysis using a cost model with uncertainties and discrete risks.
  3. Update the risk register with more qualitative categories.
  4. Request management reserve to cover unknown risks.

Correct Answer: 2 — Perform quantitative risk analysis using a cost model with uncertainties and discrete risks.

Explanation: The sponsor wants probabilities and defensible contingency. This requires quantitative analysis using a numerical model, not qualitative scoring or arbitrary percentages.

Advanced Project Management — Measuring Project Performance

Move beyond guesswork and status reporting. This course helps you measure real progress, spot problems early, and make confident decisions using proven project performance techniques. If you manage complex projects and want clearer visibility and control, this course is built for you.

This is not abstract theory. You’ll work step by step through Earned Value Management (EVM), learning how cost, schedule, and scope come together to show true performance. You’ll build a solid foundation in EVM concepts, understand why formulas work, and learn how performance data actually supports leadership decisions.

You’ll master Work Breakdown Structures (WBS), control accounts, and budget baselines, then apply core EVM metrics like EAC, TCPI, and variance analysis. Through a detailed real-world example, you’ll forecast outcomes, analyze trends, and understand contingencies and management reserves with confidence.

Learn how experienced project managers monitor performance, communicate results clearly, and take corrective action before projects slip. With practical exercises and hands-on analysis, you’ll be ready to apply EVM immediately. Enroll now and start managing performance with clarity and control.



Take Control of Project Performance!

HK School of Management helps you go beyond status reports and gut feelings. In this advanced course, you’ll master Earned Value Management (EVM) to objectively measure progress, forecast outcomes, and take corrective action with confidence. Learn how WBS quality drives performance, how control accounts really work, and how to use EAC, TCPI, and variance analysis to make smarter decisions—before projects drift off track. Built around real-world examples and hands-on exercises, this course gives you practical tools you can apply immediately. Backed by our 30-day money-back guarantee—low risk, high impact for serious project professionals.
