
Biases and Fallacies: Infosec Style

May 18, 2010

I’m starting a blog, and rather than give you some long, drawn-out intro into who I am and why I think I have something to say, I’ll jump right in after this short introduction: I enjoy information security, I enjoy cryptography, and I enjoy discussions of risk.

On to my point here.

One of the biggest problems in security is the human element. Wait, let me rephrase that: one of the biggest areas where I think security could be improved is by accounting for the human element.  Humans are full of logical fallacies and cognitive biases.  They are horrible at collecting, storing, and retrieving information (especially sensitive information), and they stink as cryptographic processors.  Yet they are essential components, and they have one great quality: they can be predictable.

One of the things we can do as owners/operators of these machines is to understand how we work (and, conversely, how we don’t).  To that end, I’ve read (and re-read) various articles, sites, and books about our biases and fallacies.  There are plenty of good sources out there, but the only way I’ve found to overcome biases and fallacies is to learn about them and try to spot them when we encounter them, and most everyone encounters them daily.  Please feel free to take some time, read through the lists, and try to keep an eye out for them in your daily routine.

To kick things off, the infosec industry has some easy-to-spot biases.  We have the exposure effect of best practices, and the zero-risk bias is far too prevalent, especially in cryptography.  We have enormous problems with poorly anchored decisions (focusing on the latest media buzzwords), and we are drowning in the Von Restorff effect on sensational security breaches.  But that’s not what I’m really going to talk about.  As I was reading the Wikipedia entry on the List of Cognitive Biases, I thought some of them sounded completely made up.  That got me thinking… Hey! I could make some up too.  We must have our own biases, errors, and fallacies, right?  I mean, we have enough false consensus to think we’re unique enough to have our own biases and fallacies, right?

After one relatively quiet evening full of self-entertainment, I have come up with my first draft of security biases and fallacies (I will spare Wikipedia my edits).  Some of these are not exclusive to infosec, but made the list anyway.  In no particular order…

Defcon Error: Thinking that there are only two modes that computers operate in: Broken-with-a-known-exploit and Broken-without-a-known-exploit.

Moscone Effect: An extension of the Defcon Error, but additionally thinking there is a product/service to help with that problem.

PEBCAK-Attribution Error: Thinking that security would be easy if it weren’t for those darn users.

Pavlov’s Certificate Error: Thinking that it is somehow beneficial for users to acknowledge nothing but false positives.

Reverse-Hawthorne Effect: The inability to instigate change in spite of demonstrating how broken everything is, also known as the Metasploit Fallacy.

Underconfidence Effect: The inability to instigate change in spite of rating the problem a “high”, also referred to as risk management.

The Me-Too Error: Reciting the mantra “compliance isn’t security”, but then resorting to sensational stories, unfounded opinions, and/or gut feel.

CISSP Error: ’nuff said.

The Cricket-Sound Error: Thinking that other IT professionals value a secure system over an operational system.

The Not-My-Problem Problem: Thinking that administrators will weigh all the configuration options carefully before selecting the best one (rather than the first one that doesn’t fail).

Policy Fallacy: A logical fallacy that states: companies produce policies; employees are supposed to read policies; therefore, employees understand company policies.

The Mighty Pen Fallacy: An extension of the Policy Fallacy; thinking that just by writing something into a policy (or regulation), that dog will poop gold.

Labeled-Threat Bias: Thinking that a pre-existing threat must be solved through capital expenditure once it has a label, also known as the “APT Bias”, more common among sales, marketing, some media, and uninitiated executives.

Illusory Tweeting: Thinking that the same medium used to instantaneously report political events and international crises around the world is also well suited for pictures of your cat.

Security-thru-Google Fallacy: I don’t have one for this, but the name is here for when I think of it.

Encryption Fallacy: If encrypting the data once is good, then encrypting it more than once must be better (see the sketch after this list).

Tempest Bias: The opposite of a zero-risk error: the incorrect thinking that ROT13 doesn’t have a place in corporate communications.
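To illustrate the Encryption Fallacy using the cipher from the Tempest Bias, here’s a minimal Python sketch (a toy example of my own, not anything from a real system): ROT13 is its own inverse, so “encrypting harder” by applying it a second time hands you back the plaintext.

```python
import codecs

secret = "Attack at dawn"

# Encrypt once: looks scrambled, feels secure.
once = codecs.encode(secret, "rot13")

# Encrypt harder: ROT13 is its own inverse, so the second
# pass undoes the first and returns the plaintext.
twice = codecs.encode(once, "rot13")

print(once)   # Nggnpx ng qnja
print(twice)  # Attack at dawn -- more encryption, less secrecy
```

Real ciphers don’t collapse quite this spectacularly when layered, but the instinct behind “double it and it must be safer” is the same fallacy.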

I know there are more…  Happy birthday, blog o’ mine.

Comment, May 18, 2010 at 10:21 am:

    I like the “Schneier Fallacy”: Something must be done! This is something; therefore it must be done.
