Archive

Archive for September, 2010

Statistical Literacy

September 23, 2010 2 comments

If I could choose one subject to force everyone to become literate in, it’d be statistics, specifically around probability and randomness.  Notice I didn’t say that people should learn statistics.  People have done that.  They memorized facts and formulas long enough to pass a course, but most people are not literate in it.  They continually fall prey to the basic errors any good professor warns students of.  So I thought I’d toss out a couple of things to get your juices flowing around statistical literacy.

The first entry is a quick, 3-minute TED talk from Arthur Benjamin on a formula for changing math education.  I’m linking his talk here just to whet the palate.

The next is a very old book in technical terms but still very applicable: “How to Lie With Statistics” by Darrell Huff.  It’s a relatively short book, and easily consumable.  Coming in at under $10, it’s something everyone should have on their bookshelf.  I was amazed at how much advertising and sales tactics still leverage the techniques listed in this book.  It is an excellent step towards statistical literacy.

Next, the impetus for this post: I found a link to the Statistics Policy Archive over at the Parliament’s website.  For example, the 7-page guide on Uncertainty and Risk has this little nugget about estimation: “At the very least a range acknowledges there is some uncertainty associated with the quantity in question.”  I’d like some coworkers to understand that statement.  Doing things like reducing the resources for a volatile project to a single number drives me stinkin’ nuts.  Keep in mind that these appear to be written for policy makers in Parliament who need a primer on statistics.  I’m still making my way through these docs.
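Just to make that nugget concrete (this is my own toy example, not anything from the guide), here’s a quick Python sketch of the difference between a single-number estimate and a range for a volatile project.  The task numbers are entirely made up:

    import random

    # Made-up numbers: each task is (optimistic, likely, pessimistic) person-days.
    tasks = [(3, 5, 10), (8, 12, 25), (2, 4, 9), (5, 8, 20)]

    def one_possible_total():
        # Draw one possible project total, treating each task as a triangular distribution.
        return sum(random.triangular(low, high, mode) for low, mode, high in tasks)

    totals = sorted(one_possible_total() for _ in range(10000))
    single_number = sum(mode for _, mode, _ in tasks)  # the "just give me one number" estimate

    print("Single number:", single_number, "person-days")
    print("10th-90th percentile range: %.0f to %.0f person-days" % (totals[1000], totals[9000]))

The sum of the “most likely” values comes out near the bottom of the simulated range, which is exactly the kind of surprise a single number hides and a range at least admits to.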

The last point of reference is one of the best books I’ve ever read, “The Drunkard’s Walk: How Randomness Rules Our Lives” by Leonard Mlodinow.  Not only does he have a light and easy writing style, he defines probability by describing the lack of it: randomness.  Aside from being an excellent geek-out topic, randomness and probability are part of every daily task and decision we make.  This book takes the reader two steps back and then three steps forward.

I bring all of this up because probability, randomness and statistics are so integrated into daily life, and they are especially prevalent in I.T. and security.  When we talk about security controls, risks and improving the world, we’re talking probability.  When we architect a solution, code up an application, or audit a system, we are surrounded by probabilities.  Becoming literate in statistics and probability is something I’d like to see more of, because I think it would improve every aspect of our profession more than anything else at this point.

Categories: Uncategorized

My Complex Hospital Stay

September 5, 2010 Comments off

I spent some time last week in the hospital having a new foot built for me.  I don’t want to dwell too much on the details, but while I was in the hospital, in that drug-induced stupor, I was thinking about “How Complex Systems Fail” by Richard Cook, MD.  After all, he wrote that about the complex healthcare system, not information systems, and I was hoping to get some pearl of enlightenment while lying there.  It’s been years since he wrote it, and I was just hoping to see some little nugget that I could take back to my day job.  However, what I found wasn’t illuminating at all, just more reinforcement of why that paper should be mandatory and memorized by all infosec people (and healthcare people).

I’m going to start off with my summary of what I observed about healthcare that I think applies nicely to infosec.

Focus on the Basics**

Citing from Cook’s paper, here’s the statement that pretty much summed up my stay:

Overt catastrophic failure occurs when small, apparently innocuous failures join to create opportunity for a systemic accident.

I had no catastrophic failure while staying in the hospital, but I had an endless supply of small failures and, luckily, no systemic accident.  Anyway, I love how that’s worded, and I feel that risk can be addressed by focusing on the basics.  Any one part of a complex system may not be complex by itself, but it becomes complex when it mixes or intertwines with the other parts of the system.  I think my care at the hospital (and the information assets at work) could be better taken care of if we simply focus on the basics and work on doing them better.  I’m going to walk through two scenarios to help illustrate the point.
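Before the scenarios, a quick digression to put a rough number on how innocuous failures “join” (my own back-of-the-envelope sketch, not anything from Cook’s paper): if a complex system has lots of independent places where a small failure can occur, the odds that several of them line up at the same time stop being negligible surprisingly fast.

    from math import comb

    def prob_at_least_k(n, p, k):
        # P(at least k of n independent small failures occur), each with probability p.
        return 1 - sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k))

    # Made-up numbers: 200 small things that can each go wrong with 2% probability
    # on a given day.  How likely is it that 3 or more of them happen together?
    print("%.0f%%" % (100 * prob_at_least_k(200, 0.02, 3)))  # roughly 76%

None of those individual 2% failures feels worth worrying about on its own, which is exactly the point of the quote above.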

Scenario 1: Apply Ice

During my stay, advice from different “expert” practitioners would contradict other practitioners’ advice on moderately unimportant topics: things like how my boot was adjusted, the benefits of elevation, or even the purpose behind this or that drug.  I could easily understand this; I often find disagreement in advice from “experts” in our field.  But the separation between advice and practice is where I saw some interesting breakdown.

Nobody would say that icing wasn’t helpful.  It’s well understood that icing reduces swelling and would speed my recovery.  But it’s how much the details diverged, and how far reality was from the advice, that really surprised me.  One advice-giver was specific enough to say 15 minutes of every hour should be spent icing.  However, during my 2.5 days in the hospital I received 2 ice packs, and both times I found them still under my leg, long melted, hours afterwards.  Both times I had to pry the ice pack out from under my leg and dump it onto the floor.

So how helpful is icing?  Like I said, nobody would say it wasn’t helpful.  But according to their actions, it was less helpful than checking vitals and only slightly less helpful than emptying the bed pan.  Right?  We’ve got an imbalance in healthcare – a misperception of the cost vs. reward of certain activities.  We also have an imbalance in infosec.  We’ve got a misperception of the costs and rewards of certain technologies, and we are not doing things like checking logs as often as we should.  Like I said, let’s focus on the basics.

Scenario 2: Vital Signs

When I first arrived for surgery, they attached a blood pressure cuff to my arm and an oxygen monitor to my finger.  I was told these would stick with me, and after a discussion about my right-handedness it was decided to attach them to my left arm.  During my stay, folks would come in at some interval (2-4 hours) and “check my vitals”.  At one point during the first night, when I was barely sensing reality, I woke up and realized that I was sleeping with two blood pressure cuffs and two oxygen monitors, one set on my left side and one on my right.  Once I pointed this out to the vital-checker, they removed both.  No, I don’t know why.  This is what Cook refers to as applying an “end-of-the-chain measure.”

How does this apply to infosec?  We need to be aware of the misattribution of mistakes.  It’s easy to look at this and say that some nurse or assistant screwed up and they should be scolded or publicly mocked or something.  But in reality we need to understand that this was a symptom.  We have to back up and truly figure out the root cause: was there not enough time to communicate?  Is the culture anti-organizational-skills?  I’d like to say that I’ve never seen this in infosec, but I’d be lying.  We’ve all seen security controls installed or mandated that are redundant or, worse, harmful (*cough* transparent encryption *cough*).

All in all, we have the tools and we have the ability.  I’d like to make a call-out for avoiding catastrophic suckiness by simply focusing on the basics.  We need to focus on doing what we know how to do, and we need to apply it coherently.  Pretty simple request, I’d say.

**This post and summary are dedicated to my crappy and non-catastrophic stay at Regions Hospital, Saint Paul, MN.

Categories: Uncategorized