How much cyber risk should an organization take?

I did a video with Joe McCafferty of MISTI last month. He wrote about it here, and you can find the video on YouTube.

I am interested in whether you share my views.

I also have some questions for you – after you watch the video:

  1. Should we be measuring cyber risk in relation to the potential effect of a breach on business objectives? Or should it be based on the effect on information assets?
  2. Do we know how to assess the level of risk?
  3. Are we doing a good job knowing how much risk we need to take to achieve our objectives? In other words, are we excessively risk averse or embracing of risk – and do we really know whether we are making the right business decision?
  4. Does it all come down to ROI, the cost and the value of additional investment in cyber prevention, detection, response, and remediation?
  5. Are we hyperventilating about cyber when there are more important risks to address?

I welcome your comments and answers.

  1. Jim DeLoach
    January 7, 2017 at 9:55 AM

    Great questions, Norman. Happy new year.

    • Norman Marks
      January 7, 2017 at 9:56 AM

      and to you, Jim

  2. January 7, 2017 at 10:44 AM

    Hi Norman.
    1. all risks, including cyber risks, should be assessed against the objectives they are threatening
    2. as with any risk, ask the question, ‘if this risk occurs, how much do we lose?’
    3. the business that knows the answer to this question is the one that succeeds
    4. probably. though it’s worth asking the question, ‘if I have to stand in front of a press conference, will I be able to present a convincing case that we had all reasonable measures in place to prevent and detect the risk, and to take appropriate action when it occurred?’
    5. in my last organisation (homes for the elderly), one of the greatest risks was poorly lit stairs resulting in falls.

  3. January 7, 2017 at 4:05 PM

    The CISSP certification materials propose that, absent a national-interest, legal, or regulatory requirement, the basis for information security spending should be proportionate to the expected loss. The two real weaknesses in the proportionate approach come from uncertainty in likelihood and uncertainty in impact. In some cases, proportionality does not even make sense. Consider an avoidable medical patient death. Would we proportionately kill only one of the IT staff per three avoidable patient deaths? While that might actually motivate preventable risks to get resolved, the actual response will certainly not be proportionate.

    Valuing data, operational uptime, and favorable, loyal customer perceptions is practical to do. Unfortunately, essentially no cyber risk certification provides skill-set training in this area. In my case, to better support this kind of valuation, I earned both an MBA and a Six Sigma Black Belt certification. If you do not have about $50,000 and two years of your life to spare, I have some really helpful suggestions below, and some trust that Wikipedia and Excel Help will let you build upon them.

    Your Board, C-level, VP, and Director audience seldom hold a PhD in information technology or information assurance, so we have to present risk in a form that allows them to hear you. Any of them can throw darts at a board or read a Magic 8-Ball all by themselves to decide whether a risk should be “High”, “Medium” or “Low”. Further, Big Four audit houses and information security scanners produce risk scores on scales of 1 to 5 or 1 to 10. But none of these will tell a decision maker how much more money is wasted if a risk is a 4 rather than a 3, or how to cost-justify a remediation project. This is where some of this math will help a lot.

    MBAs are trained to evaluate time-based risk using Poisson statistics, and a lot of data that can help calibrate this is actually oozing out of annual risk reports. The following is a classic.

    35% of firms surveyed reported that they had not been breached in the last 2 years.
    The Poisson distribution gives the odds that exactly 0, 1, 2, 3, 4, … events happen in a period.
    From the odds of nothing happening, 0.35 = exp(-L * 2 years), so L = 0.524911 expected events per year per firm in this study. L * T plays the role of a dice expectation: if I roll a six-sided die three times, the expected total is 10.5. In the same way, computing L gives us the average number of adverse events per year.

    More important to an MBA: how long, on average, before I have to open my checkbook and pay lawyers, reimburse customers, and pay fines to American Express, Visa, MasterCard, JCB, Discover, or PayPal? This turns out to be easy: it is 1/L. On average, each firm in this study goes 1.905 years between breaches.
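The back-of-envelope above can be checked in a few lines of Python (a sketch; the 35% figure and two-year window come straight from the survey quote):

```python
import math

# Back out the Poisson rate from the survey statistic
# "35% of firms reported no breach in the last 2 years".
# P(0 events in T years) = exp(-L*T)  =>  L = -ln(p0) / T
p0 = 0.35                      # fraction reporting zero breaches
T = 2.0                        # observation window, years
L = -math.log(p0) / T          # expected breaches per firm per year
mtbf = 1.0 / L                 # mean time between breaches, years

print(f"L    = {L:.6f} events/yr")   # ~0.524911
print(f"MTBF = {mtbf:.3f} years")    # ~1.905
```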

    The good news is that information security data on how much money a breach loses is much easier to get these days. You could easily find a better loss model than the following, but to start, let’s look at credit card losses with a twist. It turns out there are fixed costs per breach plus variable costs per stolen credit card. Saving you three years of my life, I present a good first start: Loss/Breach = 3 million USD + $7.25 per credit card.

    So from this very simple model, we can estimate a firm’s annual loss exposure for stolen credit cards, if it is somehow like these firms where 35% had nothing happen. We simply compute L * Loss = 0.524911 events/yr * (3 million USD + $7.25 * cards_lost) per event. A good starting point.
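A minimal sketch of that annualized loss model (the $3M fixed cost, $7.25 per card, and L are the figures quoted above; the 100,000-card example is mine):

```python
# Annualized exposure = Poisson rate * loss per breach.
L = 0.524911                   # breaches per firm per year (from the survey)
fixed = 3_000_000.0            # fixed cost per breach, USD (quoted model)
per_card = 7.25                # variable cost per stolen card, USD

def annual_exposure(cards_lost_per_breach):
    return L * (fixed + per_card * cards_lost_per_breach)

# e.g. a breach that leaks 100,000 cards: roughly $1.96M of exposure per year
print(round(annual_exposure(100_000)))
```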

    But, the argument starts: how do you know our firm will experience attack risks like those firms in that security study? The truth is that unless we are the villain, we have no idea when an event is going to occur. Instead we have a range, much like the point spread when betting on a football game. Again, Poisson statistics to the rescue.

    A fun curve fit of the likelihood spread between the 2.5% and 97.5% bounds looks as follows, expressed around the expectation L*T: L*T +/- sqrt(3.829459*L*T - 0.54884). The charm of a 95% confidence interval is that it becomes defensible to guarantee something, rather than just claiming it is true to the best of your professional ability. Even if you did not want to guarantee anything, presenting results that well-founded is good for a risk analyst’s reputation. Also, the results can become scary good.
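The curve fit can be wrapped in a small helper (this is the commenter's empirical fit, not an exact Poisson interval; note the term under the square root goes negative for small L*T, so this sketch only applies above that threshold):

```python
import math

# ~95% spread on the event count, per the empirical fit quoted above:
# L*T +/- sqrt(3.829459*L*T - 0.54884), floored at zero events.
def spread_95(L, T):
    lt = L * T                                 # expected events in T years
    half = math.sqrt(3.829459 * lt - 0.54884)  # half-width of the fit
    return max(0.0, lt - half), lt + half

lo, hi = spread_95(0.524911, 2.0)   # the survey firms, over 2 years
print(f"~95% of firms see between {lo:.2f} and {hi:.2f} breaches")
```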

    Finally, consider learning about Monte Carlo simulation, and invest in a tool even if your boss won’t fund it. If your input uncertainties are defensible, then the effect on your outputs can look almost like magic. There is a reason why high-end firms, stock brokers, and quality assurance professionals like these: the cost-of-quality estimates win respect and justify the size of improvement projects.
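A toy Monte Carlo along those lines, assuming the survey breach rate and an invented cost-per-breach distribution (the uniform clean-up noise is purely illustrative, not from any study):

```python
import math
import random

random.seed(42)
L = 0.524911                   # breaches/yr, from the survey above

def simulate_year():
    # Draw a Poisson count of breaches via inversion (fine for small L).
    k, p, target = 0, math.exp(-L), random.random()
    cum = p
    while target > cum:
        k += 1
        p *= L / k
        cum += p
    # Assumed cost model: $3M fixed plus up to $2M of variable clean-up noise.
    return sum(3_000_000 + random.uniform(0, 2_000_000) for _ in range(k))

losses = sorted(simulate_year() for _ in range(10_000))
print("median annual loss:", losses[5_000])
print("95th percentile:   ", losses[9_500])
```

The point of the exercise is the spread: a single expected-value number hides the fact that most years cost nothing and a few years cost millions.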

    A final note. Please, please, please do not use ROI when what you really mean is Return On Cost Savings. An MBA can spot a fool from 1,000 yards. Go to investor education sites and learn how to compute an ROI correctly if an information security project will actually make more money than it costs to implement. Or go for a Return On Cost Savings if an InfoSec project, done correctly, will cause money to be lost at a slower rate. Usually, InfoSec projects are Return On Cost Savings. Suppose a project costing $100,000 reduces annual credit card breach losses from 3.5 million to 0.5 million. The ROI here is negative — use a Return On Cost Savings. I did change the bottom line: if Sales keeps clocking gross revenue, we get to keep more of it. But Sales made the income; InfoSec just allows us to keep more of it.
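The distinction can be made concrete with the numbers in that example (the $100,000 cost and the before/after losses are from the paragraph above):

```python
# An InfoSec project generates no revenue, so its ROI is negative by
# construction; Return On Cost Savings is the honest metric.
project_cost = 100_000.0
loss_before = 3_500_000.0      # annual breach losses before the project
loss_after = 500_000.0         # annual breach losses after the project

annual_savings = loss_before - loss_after              # $3.0M/yr avoided
revenue_generated = 0.0                                # InfoSec makes no sales
roi = (revenue_generated - project_cost) / project_cost        # -100%
return_on_cost_savings = annual_savings / project_cost         # 30x, year one

print(f"ROI = {roi:.0%}, Return On Cost Savings = {return_on_cost_savings:.0f}x")
```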

    With this, you can leave the risk wreckage yard of “High”, “Medium” and “Low”, blow past risk professionals stuck on a valueless scale of 1 to 5, and take a better path that can make all the difference.

    Compute Responsibility.

    • Norman Marks
      January 7, 2017 at 5:24 PM

      Don, thanks for sharing your valuable insights. The first problem I have is that when it comes to likelihood, most people don’t know when they have been hacked. They should know that they are being attacked hundreds if not thousands of times each HOUR! You almost have to assume certainty of a breach of some sort. Then the trick is knowing the likelihood of each possible consequence of a breach. The second problem is that people don’t consider the effect on the business and its objectives. They use some form of value for information assets.

      Your thoughts?

  4. January 7, 2017 at 4:31 PM

    Want to turn your CFO’s head on a dime? Consider the following.

    Suppose I had an IT process with a Mean Time Between Failures that results in a known loss just about every time it happens. There is a Net Present Value technique that improves on odds-times-impact, using continuously compounding interest and a charming connection with Poisson statistics through the Mean Time Between Failures.

    M = Mean Time Between Failures = 1/L = 1 / (0.524911 events/yr) = 1.905 years
    R = Return On Invested Capital (ROIC) the average for US firms is 12.5%/yr
    Loss = the cost incurred when bad things happen, roughly every M years apart.

    NPV = Loss * exp(-R*M) + Loss * exp(-2*R*M) + Loss * exp(-3*R*M) … forever…

    NPV = Loss/ (exp(R*M)-1)

    Now you know how much money a firm must hold in reserve, earning 12.5%/yr, to save up and pay for a rolling series of adverse events. From that we can compute the annualized cost: how much money to sock away every year to be able to pay these bills when they hit — on the average.

    Risk Exposure = R * NPV = R * Loss / (exp(R*M)-1)
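Plugging the survey numbers into the two formulas above (the $3M loss per event is an assumed figure for illustration):

```python
import math

# Rolling-series NPV: losses of size `loss` hit every M years, discounted
# at continuously compounded rate R, summed to infinity.
R = 0.125                      # ROIC, US-firm average quoted above
M = 1.905                      # mean time between failures, years (1/L)
loss = 3_000_000.0             # loss per event, USD (assumed)

npv = loss / (math.exp(R * M) - 1)     # reserve needed, earning rate R
risk_exposure = R * npv                # annualized set-aside

print(f"NPV = {npv:,.0f} USD, annualized exposure = {risk_exposure:,.0f} USD/yr")
```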

    What if I want not merely a definitely attentive CFO, but an outright happy dance from the CFO, or at least from his accounting lead? (Really, some of them actually leap out of their chairs.)

    The loss I am going to experience has some fixed costs, plus variable costs that run for as long as the clean-up takes. Can these be accounted for? Why, yes.

    Suppose the average time taken to clean up a loss were N (in years).

    Loss = Fixed_Cost + Variable_Cost_Per_Year / R * (1 - exp(-R*N))

    So, the average risk of a probable, rolling series of failures, each with a fixed loss followed by time-based clean-up costs, comes to:

    Risk Exposure = R * NPV = R * (Fixed + Variable/R * (1 - exp(-R*N))) / (exp(R*M) - 1)

    Suppose I could feed my uncertainties — point spreads — into this. I could find my 95% confidence interval for losses from this class of unfixed InfoSec failures. Anything that lowers my fixed costs, lowers my variable costs, lengthens my Mean Time Between Failures, or shortens my N (Mean Time To Repair) saves me measurable cash.
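The full model can be expressed as one small function (parameter values below are illustrative, not from any study):

```python
import math

# Annualized exposure for a rolling series of failures every M years,
# each costing `fixed` up front plus `variable_per_year` for N years of
# clean-up, discounted at continuously compounded rate R.
def risk_exposure(fixed, variable_per_year, R, M, N):
    loss = fixed + variable_per_year / R * (1 - math.exp(-R * N))
    return R * loss / (math.exp(R * M) - 1)

# e.g. $1M fixed, $500k/yr clean-up lasting 6 months, ROIC 12.5%, MTBF 1.905 yrs
print(round(risk_exposure(1_000_000, 500_000, 0.125, 1.905, 0.5)))
```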

  5. January 7, 2017 at 4:49 PM

    One more free gift. It turns out that the risk of an adverse InfoSec event is roughly proportionate to the number of computers in the attack surface. So, it is very wise to back out the size of the firms sampled in an InfoSec study. But there are other ways to rough out this per-computer risk.

    Suppose I knew that the odds of a fraud were 0.066% per person per year, and that there are on average 3.5 computers per employee in the USA. The odds that a given computer is involved in a fraud would be about 0.019%/yr. I could multiply this by the number of computers a firm has to estimate its expected fraud events: L * T = computer_count * computer_frauds_per_year * years.

    Odds of having no computer-based fraud in two years = exp(-L * 2 years).
    This is roughly why some small firms feel lucky but die from the cash hit when it happens, while large firms never feel lucky but can absorb the cash hit per event.
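A quick sketch of scaling the per-computer rate by firm size (the 70-computer shop and 35,000-seat enterprise are invented examples):

```python
import math

fraud_per_person = 0.00066       # 0.066%/yr per person, quoted above
computers_per_employee = 3.5     # assumed US average, quoted above
per_computer = fraud_per_person / computers_per_employee   # ~0.019%/yr

def odds_no_fraud(computer_count, years):
    L = computer_count * per_computer    # expected fraud events per year
    return math.exp(-L * years)          # Poisson odds of zero events

# a 20-person shop (~70 computers) vs a 10,000-seat enterprise over two years:
print(f"small firm: {odds_no_fraud(70, 2):.1%} chance of no computer fraud")
print(f"large firm: {odds_no_fraud(35_000, 2):.1%} chance of no computer fraud")
```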

    • Norman Marks
      January 7, 2017 at 5:26 PM

      Don, most CIOs don’t know how many devices are on their network. Think of smart offices, advanced automation, and so on. It’s not about computers, per se.

  6. January 9, 2017 at 8:08 AM

    1) Cyber risk should be measured like all other risks: based on the effect on the business objectives.
    2) The problems of assessing the level of risk are the same as for other risks: most organisations don’t keep records of incidents and their different kinds of costs. Organisations with a certain maturity in security know their firewalls receive hundreds or thousands of attacks per day; they can’t be sure all attacks were unsuccessful. The previous replies give examples of how to calculate the risks. You can also look at what it cost other companies that were victims of cyber-related incidents, but the problem is that they seldom share that information.
    3) As long as the business doesn’t have enough knowledge of cyber security, it will put systems into operation (like a new company website) even when basic security is not in place.
    4) It should be based on both risk analysis and cost/benefit analysis, but the problem addressed in number 2 (lack of information) makes this impossible.
    5) The number of incidents involving DDoS, malware, ransomware, etc., shows that cyber risk is real. Whether an organisation should be more worried than others depends mainly on its size and kind of business.

