Is Psychology of Security Causing Cybersecurity Problems?

What do I mean by cybersecurity problems? Things like not patching or upgrading your devices, not keeping an inventory of your devices, or making changes to the network or systems without thinking about security.

Or just plain old errors, mistakes, and issues that arise after something new happens.

Why would we not pay attention to these things?

What if our managers do not understand why one should patch or upgrade? And what if consulting an expert requires money and resources? Would you pay an expert to prevent losing money? This is the core idea of the psychology of security, first laid out by Bruce Schneier in Psychology of Security (Part 1).

Bruce’s words:

“Security is a trade-off. This is something I have written about extensively, and is a notion critical to understanding the psychology of security. There’s no such thing as absolute security, and any gain in security always involves some sort of trade-off.

Security costs money, but it also costs in time, convenience, capabilities, liberties, and so on. Whether it’s trading some additional home security against the inconvenience of having to carry a key around in your pocket and stick it into a door every time you want to get into your house, or trading additional security from a particular kind of airplane terrorism against the time and expense of searching every passenger, all security is a trade-off.”

There are some interesting axioms about risk, or general pathologies, as Bruce Schneier calls them:

  • People exaggerate spectacular but rare risks and downplay common risks.
  • People have trouble estimating risks for anything not exactly like their normal situation.
  • Personified risks are perceived to be greater than anonymous risks.
  • People underestimate risks they willingly take and overestimate risks in situations they can’t control.
  • Last, people overestimate risks that are being talked about and remain an object of public scrutiny.

What are some examples of the above? One could ask: what is a 'rare' risk?

David Ropeik and George Gray have a longer list in their book Risk: A Practical Guide for Deciding What’s Really Safe and What’s Really Dangerous in the World Around You:

  • Most people are more afraid of risks that are new than those they’ve lived with for a while. In the summer of 1999, New Yorkers were extremely afraid of West Nile virus, a mosquito-borne infection that had never been seen in the United States. By the summer of 2001, though the virus continued to show up and make a few people sick, the fear had abated. The risk was still there, but New Yorkers had lived with it for a while. Their familiarity with it helped them see it differently.
  • Most people are less afraid of risks that are natural than those that are human-made. Many people are more afraid of radiation from nuclear waste, or cell phones, than they are of radiation from the sun, a far greater risk.
  • Most people are less afraid of a risk they choose to take than of a risk imposed on them. Smokers are less afraid of smoking than they are of asbestos and other indoor air pollution in their workplace, which is something over which they have little choice.


We are not interested in geopolitical risks here, even if they explain general risk-taking; we have to focus on cybersecurity risks.

People tend to focus on exotic ransomware and malware with fancy names. Do you remember the most famous viruses? CryptoLocker extorted roughly $30 million in 100 days. ILOVEYOU, the year-2000 email worm with the subject line 'I Love You', was the first of its kind and cost an estimated $15 billion. MyDoom was the most damaging, with an estimated 16-25% of all email infected at its peak. Storm Worm was picked up in roughly 200 million emails.


These viruses were very successful, as Norton notes. Their notoriety came from being the first of their kind, which is why they caused so many problems. So now that everyone knows about "I Love You" in the subject line, will you click on that email?


Even though it is now unlikely that you will receive an ILOVEYOU email, we still talk about it as if it were here haunting us. MyDoom was so prevalent for a while that some people still bring it up.


As a general rule it is good to be prudent before opening an email, but should you change your cybersecurity threat assessments because of MyDoom and CryptoLocker from the early 2000s?

The problem with security is that it is a feeling, a potential weight on your shoulders, so instead we conveniently ignore it.


There are multiple reasons why security barely registers and plays only in the background:

  1. Complexity breeds ignorance
  2. Risk is misunderstood, because the urgency of the problem is misunderstood
  3. Ultimately, though, the problem is that many humans do not take cybersecurity as seriously as they should. Why, you may ask? Because roughly 30% of people will take the risk when offered a specific gamble: they do not want to pay now in resources and money to possibly lose less later. Essentially, a lot of people are gambling that nothing will happen to them so they do not have to spend money (see the sketch after this list).
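To make that gamble concrete, here is a minimal back-of-the-envelope sketch in Python. The numbers (cost of patching, probability of a breach, size of the loss) are purely hypothetical placeholders, not figures from Schneier or anyone else; the point is only to show how a sure, small cost compares against an uncertain, larger one.

```python
# Back-of-the-envelope expected-loss comparison: pay for prevention now,
# or gamble that nothing bad happens. All numbers are hypothetical.

def expected_loss(probability_of_incident: float, loss_if_incident: float,
                  upfront_cost: float = 0.0) -> float:
    """Expected cost = what you pay up front + probability-weighted loss."""
    return upfront_cost + probability_of_incident * loss_if_incident

# Option A: spend on patching and upkeep, which (we assume) lowers the odds.
patch_cost = 5_000          # hypothetical annual cost of patching/upgrades
p_breach_patched = 0.02     # hypothetical chance of a serious incident anyway

# Option B: spend nothing and gamble.
p_breach_unpatched = 0.20   # hypothetical chance of a serious incident
breach_loss = 100_000       # hypothetical cost of one serious incident

cost_if_you_pay = expected_loss(p_breach_patched, breach_loss, patch_cost)
cost_if_you_gamble = expected_loss(p_breach_unpatched, breach_loss)

print(f"Expected cost if you pay for prevention: ${cost_if_you_pay:,.0f}")
print(f"Expected cost if you gamble:             ${cost_if_you_gamble:,.0f}")
# With these made-up numbers the gamble "wins" 80% of the time (nothing
# happens), which is exactly why people take it -- yet its expected cost
# ($20,000) is far higher than paying for prevention ($7,000).
```

The exact figures will differ for every organization; the psychology is the same. The sure loss (patching budget) feels worse than the probable loss (a breach that might never happen), so many people choose the gamble.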

Little do they know how easy it is to attack a million computers, and if ignorance causes their computers to be caught in the maelstrom (like the old MyDoom and others), it will be because they were unwilling to spend a little time and money to prevent it.

It takes time to stop gambling with your computer systems and hoping against hope that nothing will happen.

Do you think nothing will happen if you do nothing to protect your systems?

There are things you can do to change your methods so that your systems will be in better shape; reducing the likelihood of risk is possible.

Contact Us to discuss.
