I Know Weather, And You Sir Are No Weather
Stop for a moment and think about how much trust you have in a weather forecaster.
Our estimation of weather forecasting is influenced by a memory bias called the Von Restorff effect, which states simply that people remember things that stick out. When a weather forecast is incorrect we may be unprepared or otherwise negatively impacted by the event, thus making it more memorable: the wedding that is rained out, the day at the park that was rescheduled and turns out to be nice, and so on. When the forecast is correct, it is a non-event. Nobody takes notice when the expected occurs.
We have this in information security as well. We remember the events that stick out and forget the expected, which skews our memory of events. But that’s not what I want to talk about. I want to talk about the weather. Even though we measure the performance of forecasters poorly, predicting the weather, even with all available technology, is very, very difficult. Let’s face it: weather forecasters are often quite brilliant, usually highly educated and skilled (with a few exceptions among TV personalities). Nobody else in that position would fare any better predicting the weekend weather. Let’s see how this ties to measurements in our information systems.
Edward Lorenz was restarting a weather modeling program in 1961, keying in data points from a printout. But rather than entering the full “.506127”, he rounded and keyed in “.506”. This run produced wildly different results from his previous ones. That one event started the wheels in motion for an awesome Ashton Kutcher film years later. Lorenz drew this takeaway from the effect:
If, then, there is any error whatever in observing the present state – and in any real system such errors seem inevitable – an acceptable prediction of the instantaneous state in the distant future may well be impossible.
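Lorenz’s model had a dozen variables, but the same sensitivity shows up in even the simplest chaotic system. Here is a minimal sketch using the logistic map as a stand-in for his weather model (the map and the rounded starting value are my illustration, not his actual code):

```python
def trajectory(x, r=3.9, steps=50):
    """Iterate the chaotic logistic map x -> r*x*(1-x), recording each state."""
    states = [x]
    for _ in range(steps):
        x = r * x * (1 - x)
        states.append(x)
    return states

full    = trajectory(0.506127)  # full-precision initial condition
rounded = trajectory(0.506)     # the Lorenz-style rounded re-entry

diffs = [abs(a - b) for a, b in zip(full, rounded)]
# The two runs start out nearly identical, then diverge completely.
print(f"initial gap: {diffs[0]:.6f}, largest gap: {max(diffs):.3f}")
```

A difference in the fourth decimal place eventually swamps the whole trajectory; that is the defining property of a chaotic system.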
I want to combine that with a definition for measurement from Douglas Hubbard:
Measurement is a set of observations that reduce uncertainty where the result is expressed as a quantity.
What’s this have to do with infosec?
Hubbard is saying that measurement is a set of observations, and Lorenz is saying that any error in observation (measurement) renders an acceptable prediction improbable. The critical thing to note here is that Lorenz is talking about weather, which is a chaotic system. Information systems are not chaotic systems (though some wiring closets imitate one well). The point is that we should, in theory, be able to see a benefit (a reduction in uncertainty) even with error in our measurements. In other words, because we are not in a chaotic system we can deal with imperfect data.
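Hubbard’s point can be sketched in a few lines: even when every individual observation carries large error, repeated observation shrinks our uncertainty about the underlying quantity. The “true value” and noise level below are made-up numbers purely for illustration:

```python
import random
import statistics

random.seed(7)
TRUE_VALUE = 42.0  # hypothetical quantity we care about
NOISE = 10.0       # each observation carries sizable random error

def observe():
    """One imperfect measurement: the truth plus large Gaussian error."""
    return random.gauss(TRUE_VALUE, NOISE)

results = {}
for n in (5, 50, 500):
    samples = [observe() for _ in range(n)]
    mean = statistics.fmean(samples)
    # Standard error of the mean: uncertainty shrinks like 1/sqrt(n).
    sem = statistics.stdev(samples) / n ** 0.5
    results[n] = (mean, sem)
    print(f"n={n:4d}  estimate={mean:6.2f} +/- {2 * sem:5.2f}")
```

Each measurement is off by as much as ten units, yet the interval around the estimate narrows steadily. That is measurement in Hubbard’s sense: not perfect observation, just reduced uncertainty.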
I have a picture in my head to describe where information systems are on the continuum from simple to chaotic. I see a ball in the center labeled “simple”, and flat rings circling it and expanding out from it. The first ring is labeled “hard”, then “complex”, and finally “chaotic” as the outer ring. Weather systems would be in the outer ring. For the I.T. folks, I would put an arrow pointing to the last bit of complexity right before chaos and write “you are here” on it. Complexity is the “edge of chaos”, and infosec is based entirely on complex systems. Take this statement from “How Complex Systems Fail” by Richard Cook:
Catastrophe is always just around the corner.
Complex systems possess potential for catastrophic failure. … The potential for catastrophic outcome is a hallmark of complex systems. It is impossible to eliminate the potential for such catastrophic failure; the potential for such failure is always present by the system’s own nature.
Go through this exercise with me: Think of the most important info system at your company (probably email service). Think of all the layers of availability-protection piled on that to keep it operational. Clustering, backups, hot-swap whateverables. Would any of the technical staff in charge of it – the subject matter experts – ever be surprised if it stopped functioning? It shouldn’t fail, and in most cases probably won’t, but who would be surprised if it did? I think everyone in this industry knows that the phone can ring at any moment with a message of catastrophe. As Cook stated, this attribute is a “hallmark of complex systems”.
The first step in solving any problem is to realize there is a problem.
Just as it’s good to become aware of the cognitive biases we have towards high profile events (like bad weather forecasts), we also have biases and framing errors when it comes to complex technical systems. For me, simply realizing this and writing it down is a step up from where I was yesterday. I am beginning to grasp that there are no easy decisions – and just how much mass and momentum that statement has.