The Simple Power of OpenPERT: ALE 2.0

August 17, 2011

Yup, Chris Hayes (and I) have released the 1.0 version of OpenPERT.  I had a sneaking suspicion that most people would do what I did with my first Excel add-in: “Okay, I installed it, now what?”  Then perhaps skim some documentation, type in a few formulas, and walk away thinking about lunch or something.  In an attempt to keep OpenPERT from being *that* add-in, we created something to play with – the next generation of the infamous ALE model.  We call it “ALE 2.0.”

About ALE

ALE stands for Annualized Loss Expectancy, and it is taught (or was taught back in my day) to everyone studying for the CISSP exam.  The concept is easy enough: estimate the annual rate of occurrence (ARO) of an event and multiply it by the single loss expectancy (SLE) of that event.  The output is the annualized loss expectancy, or ALE.  Typically people are instructed to use single-point estimates, and if this method has ever been used in practice it’s generally with worst-case numbers or perhaps an average.  Either way, you end up with a single number that, while precise, will most likely not line up with reality no matter how much effort you put into your estimates.
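
To make that concrete with made-up numbers: if you expect 24 incidents a year (the ARO) at roughly $50 per incident (the SLE), the classic calculation gives ALE = 24 x $50 = $1,200 per year.  One tidy, precise-looking number.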

Enter the next generation of ALE

ALE 2.0 leverages the beta distribution with PERT estimates and runs a Monte Carlo simulation.   While that sounds really fancy and perhaps a bit daunting, it’s really quite simple, and most people should be able to follow the logic by digging into the ALE 2.0 example.
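
If you want a peek under the hood, here is a rough Python sketch (not OpenPERT’s actual internals) of how a minimum / most likely / maximum estimate maps onto a beta distribution, using the common modified-PERT parameterization with a shape value of 4.  The pert_sample() name and the use of numpy are just for illustration.

```python
# Rough sketch of a PERT draw: not OpenPERT's code, just the common
# modified-PERT parameterization (shape lambda = 4) for illustration.
import numpy as np

def pert_sample(minimum, most_likely, maximum, lamb=4.0, size=1, rng=None):
    """Draw random values from a PERT (scaled beta) distribution."""
    rng = np.random.default_rng() if rng is None else rng
    span = maximum - minimum
    # Shape parameters of the underlying beta distribution
    alpha = 1 + lamb * (most_likely - minimum) / span
    beta = 1 + lamb * (maximum - most_likely) / span
    # Sample on [0, 1], then rescale back to [minimum, maximum]
    return minimum + span * rng.beta(alpha, beta, size)
```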

Let’s walk through ALE 2.0 with a case: estimating the annual cost of handling virus infections at some made-up mid-sized company.

For the annual rate of occurrence, we think we get around 2 virus infections a month on average, some months there aren’t any, and every few years we get an outbreak infecting a few hundred machines.  Let’s put that in terms of a PERT estimate: at a minimum (in a good year) we’d expect maybe 12 infections, most likely around 30 per year, and in a bad year we could see 260.

For the single loss expectancy, we may see no loss at all: the anti-virus picks it up and cleans it automatically.  Most likely we spend 30 minutes manually thumping it; worst case we do a system rebuild, which takes about 2 hours of non-dedicated time plus some other overhead tasks.  Putting that in terms of money, we might say the minimum is $0, most likely about $50 of time, and worst case $200.

[image]

Now let’s hit the button.  Some magic happens and out pops the output:

[image]

Given those inputs, a couple of the simulations produced no loss at all (the min value) from responding to viruses, the average came out to about $4,400 in annualized loss, and the really bad years (in this case) ran up to around $35k.  You can make other statements from the output as well, like “10% of the ALE simulations exceeded $9,000.”

But this isn’t really about the numbers, or even about ALE for that matter; it’s about understanding what OpenPERT can do.

What’s going on here?

Swap over to the “Supporting Data” tab in Excel; that’s where the magic happens.  Starting in cell A2, we call the OPERT() function with the values entered for the ARO.  The OPERT function takes the minimum, most likely, and maximum values, plus an optional confidence value, and returns a random value drawn from a beta distribution built on those estimates.  That ARO calculation is repeated for 5,000 rows in column A; that’s the Monte Carlo portion.  Column B holds the SLE calculations (OPERT function calls on the SLE estimates) for the 5,000 simulations, and column C is simply the ARO multiplied by the SLE (the ALE for that simulation).

In summary, the OPERT() function returns a single draw for each of the two input estimates (ARO and SLE), and we repeat that to get a large enough sample size (5,000 is generally plenty, especially for this kind of broad-stroke ALE technique).
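
For anyone who’d rather see the same logic outside of Excel, here is a loose Python re-creation of the Supporting Data tab, assuming the hypothetical pert_sample() sketch from earlier: column A becomes the ARO draws, column B the SLE draws, and column C their product.  The exact numbers will wander a bit from run to run, but they should land in the same neighborhood as the output above.

```python
# Loose re-creation of the "Supporting Data" tab: 5,000 simulated years,
# each one an ARO draw multiplied by an SLE draw.  Assumes the pert_sample()
# sketch shown earlier; this is not the add-in itself.
import numpy as np

runs = 5000
rng = np.random.default_rng()

aro = pert_sample(12, 30, 260, size=runs, rng=rng)  # infections per year
sle = pert_sample(0, 50, 200, size=runs, rng=rng)   # dollars per infection
ale = aro * sle                                     # annualized loss per simulated year

print(f"min ALE:  ${ale.min():,.0f}")
print(f"mean ALE: ${ale.mean():,.0f}")
print(f"90th percentile: ${np.percentile(ale, 90):,.0f}")  # "10% of simulations exceeded..."
print(f"max ALE:  ${ale.max():,.0f}")
```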

Also, if you’re curious, the table to the right of column C is the data used to construct that pretty graph on the first tab (shown above).

Next Steps

The ALE method itself may have limited application, but it’s the thought process behind it that’s important.  Combining PERT estimates with the beta distribution to feed Monte Carlo simulations is what makes this approach better than point estimates.  It could be used for a multitude of applications, say estimating the number of piano tuners in Chicago or any number of other broad estimations, hopefully some of them even related to risk analysis.  We’re already working on the next version of OpenPERT too, in which we’re going to integrate Monte Carlo simulations and a few other features.

It would be great to build out some more examples.  Can you think of more ways to leverage OpenPERT?  Having problems getting it to work?  Let us know, please!
