UK Government guidance on risk and cyber: the very good and the very bad

November 2, 2018 2 comments

The National Cyber Security Centre (NCSC) is part of the UK’s Government Communications Headquarters (GCHQ). If you are like me, you may have only heard of GCHQ in an unflattering context: working with US intelligence agencies to spy on foreign heads of state and hack foreign agencies.

As a UK intelligence organization, GCHQ seeks to keep the country’s citizens safe. Through the NCSC, it provides advice on cyber security.

I am going to reference two pieces of NCSC guidance. The first is great and the second terrible.

In December 2017, the NCSC published The fundamentals of risk.

Here are some excellent insights from that publication:

  • Risk management exists to help us to create plans for the future in a deliberate, responsible and ethical manner.
  • The purpose of risk management is to enable us to make the best possible decisions, based on our analysis of future events and outcomes. The future can be anticipated, but within limits defined by our uncertainty in our analysis.
  • This requires risk managers to explore what could go right or wrong in an organisation, a project or a service, and recognising that we can never fully know the future as we try to improve our prospects.
  • Risk management is about analysing our options and their future consequences, and presenting that information in an understandable, usable form to improve decision making.
  • Risk management often requires a relationship between people who analyse risks and people who make decisions based on that analysis. Communication between these two groups must be clear, understandable and useful. If the people who make decisions can’t interpret the analysis they’re presented with, then there is little point in doing risk analysis at all.

This is consistent with what I have said here and in my books.

Risk management has to help those in leadership make informed and intelligent decisions. That requires using business language rather than technobabble and presenting information about risk in a way that is actionable.

For example, provide information about cyber risk that enables executive management and the board to determine whether it makes more sense to invest in addressing that risk, a new marketing program, an acquisition, or in hiring additional product developers.

Saying that a risk is ‘high’ does not help management. Should they invest limited resources in mitigating something that might happen with some level of pain, or in a revenue-generating initiative that is seen as highly likely to succeed?

 

The second publication is from September 2018: Board toolkit: five questions for your board’s agenda. It says that “the NCSC have identified a range of questions which will help generate the right discussions between board members and their CISOs and increase awareness of key topics in cyber security”.

The five questions are simply wrong. They are down in the weeds instead of addressing the big picture:

  1. How do we defend our organisation against phishing attacks?
  2. How does our organisation control the use of privileged IT accounts?
  3. How do we ensure that our software and devices are up to date?
  4. How do we make sure our partners and suppliers protect the information we share with them?
  5. What authentication methods are used to control access to systems and data?

Question #4 barely makes my list of the top ten or so questions a board should be asking.

 

Here are the top five questions I think the board should be asking about cyber risk. (There are obviously more depending on the answers.)

  1. How could a cyber breach affect our business? What business objectives might be affected, by how much, and what is the likelihood?
  2. What’s the worst that could happen and how likely is that? How likely is it that a cyber breach would result in an unacceptable level of harm and how do you define that level of harm?
  3. How confident are you in your assessments? Who is involved in making them?
  4. Are you satisfied that we have a reasonable level of investment in the prevention, detection, and response to a breach? If not, what are you doing to bring cyber-related business risk to a level that is acceptable?
  5. How do you consider cyber-related risks in your strategic and tactical business decisions?

 

I would want the CEO to answer these questions, not defer to the CIO or CISO.

 

What do you think?


Talking about risk and opportunity

October 26, 2018 6 comments

Some talk about opportunity as “the other side of the coin” from risk.

One is good and the other bad.

That is how COSO views the two words, risk and opportunity. ISO sees them differently, defining risk as the effect of uncertainty on objectives. That effect could be positive or harmful.

A few governance codes, such as the King IV code in South Africa, have changed their language from talking about board oversight of risk management to the oversight of risk and opportunity management.

In this view, an opportunity is a situation where there is the possibility of action that is likely to lead to reward or gain. For example, if a homeowner is dissatisfied with his or her realtor, that is an opportunity for another realtor.

Certainly, those situations exist and organizations need to be able to recognize, understand, assess, and then seize them where appropriate.

I encourage you to view this excellent video with David Hillson (a.k.a. the Risk Doctor): Risk and Opportunity: How can risk be good?

As David points out (and I said in World-Class Risk Management and Risk Management in Plain English), the tools and techniques traditionally used to ‘manage’ potential harms (risks, in normal language) can and probably should be used to manage the potential for gain (opportunities).

Others, such as the author of an article from software vendor Enablon, talk about How risks can turn into opportunities. The idea is that by addressing a source of risk you can create opportunities for gain.

We had that when I ran internal audit at Tosco Corp. One of our risks was the potential for changes in the relative prices of our raw materials (primarily crude oil) and products (gasoline, diesel, jet fuel, and other refined products) to adversely affect our margins and earnings. Management established a sophisticated and talented trading operation to hedge those commodities. In the process, they gained the ability to trade for profit and added to their earnings. (Of course, the trading activity also created new risks.)

 

Expanding ‘risk management’ beyond a paranoid view of what might happen is progress, but is it sufficient?

 

As I wrote earlier, the level of risk is not a point. There is a range of potential consequences from an event, situation, or decision, and each has its own likelihood.

In that post, I included an illustrative chart, but all the potential consequences were negative.

In real life, there are some situations where the range of consequences might include both positive and negative effects.

In other words, the idea that risk and opportunity are different because (as David says) one has a positive and the other a negative sign is not entirely correct.

 

For example, if an organization introduces a new product with the hope that related revenue in the first year will be $800,000 or more with earnings of $180,000, that objective may be achieved or exceeded, or they may fail to achieve it.

In fact, revenue could range from the unlikely zero to the unlikely $1.5 million, with many possibilities in between. If revenue is below $500,000, they would incur a loss. The chart below shows net earnings assuming a fixed cost of $300,000 and a variable cost of 40% of revenue.

[Chart: range of earnings]
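For anyone who wants to reproduce the arithmetic behind that chart, here is a minimal sketch. It assumes only the figures given above (a $300,000 fixed cost and variable costs of 40% of revenue); the revenue points chosen are illustrative.

```python
# Minimal sketch of the earnings range behind the chart above.
# Assumes the example's figures: $300,000 fixed cost and variable
# costs of 40% of revenue. The revenue points are illustrative.

FIXED_COST = 300_000
VARIABLE_COST_RATE = 0.40

def net_earnings(revenue: float) -> float:
    """Net earnings = revenue - fixed cost - variable cost (40% of revenue)."""
    return revenue - FIXED_COST - VARIABLE_COST_RATE * revenue

for revenue in (0, 250_000, 500_000, 800_000, 1_000_000, 1_500_000):
    print(f"Revenue ${revenue:>9,}: net earnings ${net_earnings(revenue):>10,.0f}")

# Revenue of $500,000 is the break-even point; the targeted $800,000
# yields the $180,000 of earnings mentioned above; the unlikely
# $1.5 million would yield $600,000.
```

The point of laying it out this way is that a single 'expected' earnings number hides both the loss scenarios below $500,000 of revenue and the upside above the target.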

 

The likelihood of achieving or exceeding the targeted revenue and earnings is 60%.

 

The point I am making is that events and situations can have a range of potential consequences, some of which may be negative and some positive.

In the example above, the management team has to be ready to respond if it looks like the product will do better than expected (making sure manufacturing and distribution can keep pace) or worse.

 

Do the terms risk and opportunity make sense as a basis for understanding and assessing what might happen?

Isn’t it better to recognize that there is a range and we have to be prepared to address all the possibilities?

 

I welcome your comments.

SEC investigates cyber-related frauds

October 21, 2018 6 comments

On October 16th, the US Securities and Exchange Commission published Report of Investigation Pursuant to Section 21(a) of the Securities Exchange Act of 1934 Regarding Certain Cyber-Related Frauds Perpetrated Against Public Companies and Related Internal Accounting Controls Requirements.

This is an important report that risk and audit professionals should read and consider. They should also consider bringing it to the attention of the board and its audit committee.

The SEC investigated cyber-related frauds against “nine issuers that were victims of one of two variants of schemes involving spoofed or compromised electronic communications from persons purporting to be company executives or vendors”.

They said that:

“Each of the nine issuers lost at least $1 million; two lost more than $30 million. In total, the nine issuers lost nearly $100 million to the perpetrators, almost all of which was never recovered. Some of the investigated issuers were victims of protracted schemes that were only uncovered as a result of third-party actions, such as through detection by a foreign bank or law enforcement agency. Indeed, one company made 14 wire payments requested by the fake executive over the course of several weeks—resulting in over $45 million in losses—before the fraud was uncovered by an alert from a foreign bank. Another of the issuers paid eight invoices totaling $1.5 million over several months in response to a vendor’s manipulated electronic documentation for a banking change; the fraud was only discovered when the real vendor complained about past due invoices.”

The report described the schemes used, generally the result of spoofs that fooled company management and staff.

The SEC had previously issued guidance related to the disclosure of cyber-related risks and incidents. Commission Statement and Guidance on Public Company Cybersecurity Disclosures should be read and understood for its implications for risk management and for a company’s disclosure controls and procedures (the adequacy of which the CEO and CFO are required to attest to in their quarterly and annual filings under s302 of the Sarbanes-Oxley Act of 2002).

In this report, the SEC states:

In light of the risks associated with today’s ever expanding digital interconnectedness, public companies should pay particular attention to the obligations imposed by Section 13(b)(2)(B) to devise and maintain internal accounting controls that reasonably safeguard company and, ultimately, investor assets from cyber-related frauds. More specifically, Section 13(b)(2)(B)(i) and (iii) require certain issuers to “devise and maintain a system of internal accounting controls sufficient to provide reasonable assurances that (i) transactions are executed in accordance with management’s general or specific authorization,” and that “(iii) access to assets is permitted only in accordance with management’s general or specific authorization.” As the Senate underscored when these provisions were passed, “[t]he expected benefits from the conscientious discharge of these responsibilities are of basic importance to investors and the maintenance of the integrity of our capital market system.”

Please note that the legal requirement to have internal controls that prevent and, if necessary, detect frauds is not new. The guidance in this new report by the SEC simply points out that the system of internal control should address the risk of fraud due to cyber-related activities. Those should include not only the spoofs discussed in the SEC report, but also any losses due to hackers breaching cyber defenses.

 

THIS IS NOT A NEW SARBANES-OXLEY s404 REQUIREMENT.

SOX s404 still requires a top-down and risk-based approach that will prevent or detect, on a timely basis, a material misstatement (error or omission) of the financial statements that are filed with the SEC.

Note:

  1. Internal Control over Financial Reporting for SOX s404 only needs to address frauds that might be material (in amount or based on a qualitative factor, in rare cases) to the prudent investor.
  2. Even those only need to be covered by the s404 scope if they would result in a misstatement of the financials filed with the regulator. If the fraud loss is correctly recorded and reported as an operating expense, the financials are not misstated. This is expressly stated in Auditing Standard No. 5.
  3. However, it is appropriate to have controls that will prevent or detect lower levels of cyber-related fraud when doing so makes good business sense – in other words when justified by the level of risk to the business. The controls that address the lower levels of risk should not be included in scope for SOX.

 

So, read the report (and the earlier guidance on disclosures) and discuss what it means to your organization. But don’t rush to add non-material cyber-related fraud to your scope for SOX.

 

I welcome your comments.

A basic principle most people don’t understand about risk

October 11, 2018 16 comments

Almost everybody makes a fundamental error when it comes to assessing a risk (what might happen).

It doesn’t matter whether they are using a heat map, a risk register, or a risk profile.

They show the level of risk as a point: the likelihood of a potential impact or consequence.

But 99% of the time this is wrong.

99% of the time, there is a range of potential consequences, each with its own likelihood.

Even if you ignore the fact that there are more often than not multiple consequences from an event, situation, or decision, anybody trying to understand risk and its effect on objectives needs to stop presenting the level of risk as a point.


This was brilliantly illustrated in the Ponemon Institute’s latest report on cyber. Their 13th Cost of a Data Breach Study (sponsored by IBM) is an excellent read. It has a number of interesting findings that I will discuss in a separate blog.

The content that is relevant to this discussion is a graphic that shows the range of potential consequences from a cyber breach. Their graphic shows the likelihoods of having anywhere from 10,000 to 100,000 records stolen. (They separately discuss the cost of what they call a ‘mega breach’, when more than a million records are stolen.)

Using their number for the average cost to the business (across all sectors and geographies) of the loss of a single data record, I created the graphic below. (The probabilities are for the next 24-month period.)

[Chart: cyber range]

As you can see, in their estimation a cyber breach can result in a loss anywhere from $1.5 million to $14.8 million. (The losses suffered by organizations in the medical sector are about triple that amount.) Losses can extend to $350 million for the very few who have 50 million records stolen.
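To make the arithmetic concrete, here is a minimal sketch. The roughly $148 average cost per record is implied by the $1.5 million to $14.8 million range for 10,000 to 100,000 records quoted above; the likelihoods shown are illustrative placeholders, not figures from the Ponemon study.

```python
# Minimal sketch of the range of potential losses from a breach.
# Uses the ~$148 average cost per record implied by the range quoted
# above. The likelihoods are illustrative placeholders, NOT figures
# from the Ponemon study.

AVG_COST_PER_RECORD = 148  # USD, implied by $1.5m-$14.8m for 10k-100k records

# (records stolen, illustrative likelihood over the next 24 months)
scenarios = [
    (10_000, 0.30),
    (25_000, 0.20),
    (50_000, 0.10),
    (100_000, 0.03),
]

for records, likelihood in scenarios:
    loss = records * AVG_COST_PER_RECORD
    print(f"{records:>7,} records: loss ~${loss / 1e6:4.1f} million, likelihood ~{likelihood:.0%}")

# There is no single 'level of risk' to plot on a heat map; each
# potential consequence has its own likelihood.
```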

If this is reality, which point do you select to put on a heat map or risk profile?

If you want people to make intelligent and informed decisions relating to this risk, they have to understand the full picture. That picture starts with a chart that shows the range of potential consequences. Ideally, it shows how they might affect enterprise objectives.

What is an acceptable level of risk? For certain it’s not an ‘amount’, as preached by COSO. I talk about an acceptable likelihood of achieving your objectives.

But let’s just focus on this graphic for now.

 

Is the range of potential consequences and their likelihood acceptable?

Are there any individual points in the range that are unacceptable?

Does it make sense to use techniques like Monte Carlo to replace the chart with a single number? (See the sketch below.)

How do you provide actionable information that enables intelligent and informed business decisions?
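On the Monte Carlo question above, it is worth noting that a simulation produces the full distribution of outcomes; the range is only lost when the output is collapsed to a single summary statistic. Here is a minimal, illustrative sketch; the breach probability and the breach-size distribution are assumptions for illustration, not figures from the Ponemon study.

```python
# Minimal Monte Carlo sketch: simulate many possible outcomes and look
# at the resulting distribution of losses. The breach probability and
# breach-size distribution are assumptions for illustration only.

import random
import statistics

random.seed(42)

AVG_COST_PER_RECORD = 148   # USD, implied by the figures quoted above
BREACH_PROBABILITY = 0.28   # assumed likelihood of a breach in the period

def simulate_loss() -> float:
    """One trial: either no breach, or a breach of a randomly drawn size."""
    if random.random() > BREACH_PROBABILITY:
        return 0.0
    records = random.lognormvariate(10.3, 0.8)  # assumed spread of breach sizes
    return records * AVG_COST_PER_RECORD

losses = sorted(simulate_loss() for _ in range(100_000))

print(f"Mean (expected) loss:  ${statistics.mean(losses) / 1e6:5.2f} million")
print(f"95th percentile loss:  ${losses[int(0.95 * len(losses))] / 1e6:5.2f} million")
print(f"Worst simulated loss:  ${losses[-1] / 1e6:5.2f} million")

# Reporting only the mean collapses the whole range into one number and
# hides exactly the information decision makers need.
```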

 

I welcome your comments.

 

For more of my views on risk management, consider my best-selling books.

Costco reports a material weakness in internal control. But is it really?

October 6, 2018 17 comments

In a news release on October 4th, Costco Wholesale announced its operating results for the 4th quarter and full year ended September 2nd.

In that release, it stated:

While the Company is still completing its assessment of the effectiveness of its internal control over financial reporting as of September 2, 2018, in its upcoming fiscal 2018 Annual Report on Form 10-K, it expects to report a material weakness in internal control. The weakness relates to general information technology controls in the areas of user access and program change-management over certain information technology systems that support the Company’s financial reporting processes. The access issues relate to the extent of privileges afforded users authorized to access company systems. As of the date of this release, there have been no misstatements identified in the financial statements as a result of these deficiencies, and the Company expects to timely file its Form 10-K.

Remediation efforts have begun; the material weakness will not be considered remediated until the applicable controls operate for a sufficient period of time and management has concluded, through testing, that these controls are operating effectively. The Company expects that the remediation of this material weakness will be completed prior to the end of fiscal year 2019.

This information is surprising on many fronts.

 

For a start, it is rare these days for a company to determine that it has a material weakness related to IT general controls (ITGC).

Let me explain why it is rare, and why I personally question whether management got this right.

A material weakness is defined by the PCAOB Auditing Standard No. 5 (now renumbered as AS No. 2201) as:

“.. a deficiency, or a combination of deficiencies, in internal control over financial reporting, such that there is a reasonable possibility that a material misstatement of the company’s annual or interim financial statements will not be prevented or detected on a timely basis”.

Let’s start with what would constitute a material misstatement of Costco’s financials.

Their full year pre-tax net income, according to the release, is just over $4 billion. Materiality is generally 5% of pre-tax net income, which in this case would be $200 million.

It is very hard to envisage a situation where a $200 million error would not be noticed.

To meet the threshold for a material weakness, there has to be “a reasonable possibility” that a $200 million misstatement would not be prevented or detected on a timely basis.

Is there a reasonable possibility that defects in “user access and program change-management” could lead to a $200 million error that is undetected by other controls, such as comparisons of actual to forecast, margin analysis, and so on?

In the early years of SOX compliance, ITGC failures were among the top sources of material weaknesses (the others were tax treatments and the organization’s knowledge of accounting rules).

But while ITGC deficiencies continue to be present, it is unusual to see them disclosed as material weaknesses.

The reasons are fairly clear: ITGC deficiencies do not have a direct effect on the financial statements. They simply indicate that the automated controls, or the IT-dependent elements of other controls, may not operate consistently as they should.

Costco has not disclosed any failures of such automated controls or of the IT-dependent elements of other controls, and presumably they would have if any existed. Neither have they disclosed any accounting errors that flowed from such deficiencies.

If I were on the board of Costco, I would be asking how these control deficiencies might lead to a $200 million misstatement of the annual financials (or a $70 million error in the 4th quarter, when pre-tax net income was $1.4 billion).
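For anyone who wants to check that arithmetic, here is a minimal sketch using the common 5% rule of thumb and the pre-tax income figures cited above.

```python
# Quick check of the materiality arithmetic. The 5% rule of thumb and
# the pre-tax income figures come from the text above.

ANNUAL_PRETAX_INCOME = 4_000_000_000  # just over $4 billion for the full year
Q4_PRETAX_INCOME = 1_400_000_000      # $1.4 billion for the 4th quarter
MATERIALITY_RATE = 0.05               # common 5% rule of thumb

print(f"Annual materiality: ~${ANNUAL_PRETAX_INCOME * MATERIALITY_RATE / 1e6:.0f} million")
print(f"Q4 materiality:     ~${Q4_PRETAX_INCOME * MATERIALITY_RATE / 1e6:.0f} million")
# Roughly $200 million for the year and $70 million for the quarter.
```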

It is difficult for me to imagine how that could occur. I may be wrong, but I suspect their audit firm, KPMG, insisted that these deficiencies be categorized as material weaknesses.

Calling these material weaknesses does not seem reasonable to me.

 

What else surprised me?

They are saying that they will have corrected these deficiencies within one year.

Assuming that they truly are material weaknesses, how can it be acceptable to wait a full year to get them fixed?

How can the market rely on their quarterly reports if the system of internal control is deemed ineffective for that period?

I would not accept that as a board member, an investor, or a regulator!

 

Finally, the company concluded in its prior quarterly report that its disclosure controls and procedures (which include its internal control over financial reporting) were effective.

If these were in fact material weaknesses (which I doubt), then the question arises as to when management became aware of them – or should have been aware of them. If that predates the 3rd quarter 10-Q, the company may have a problem.

 

I have to wonder whether companies and their auditors fully understand the principles of SOX compliance and what AS5 actually says!

I teach SOX compliance efficiency and effectiveness to SOX program managers (and their equivalents, such as internal audit management). In my experience, the great majority of companies are doing too much (and the wrong) work and the external audit firms have lost touch with the principles of the top-down and risk-based approach mandated by the PCAOB.

 

I welcome your views.

 

By the way, Costco shares lost 4% following the news release. It is not clear how much should be attributed to the material weakness disclosure.

 

 

The basics of risk management

September 29, 2018 7 comments

I want to congratulate David Hillson (a.k.a. the Risk Doctor) for his video explaining his view of risk management basics.

In Risk management basics: What exactly is it?, he takes less than five minutes to sum up risk management with six questions:

  1. What am I trying to achieve?
  2. What might affect me? Are there things out there in the future that might help or hinder me?
  3. Which of those things that might affect me are the most important?
  4. What should I do about it?
  5. Did it work?
  6. What changed?

He says that “managing risk is one of the most natural things we can do and one of the most important”. I have to agree, although I don’t think we do it as well as we should.

I like his six questions.

David has written 11 books on risk management, which is more than I have, and I have to admit that I have not read them. While I suspect that we will not agree on every topic, such as the value of risk appetite statements, his six basic questions are similar to my set.

This is what I have included in the book I am writing now, on making business sense of technology risk.

I like to explain risk management as something every effective manager does:

  • They understand where they are today and where they need to go (their objectives).
  • They understand, as best they can, what might happen as they work towards achieving those objectives. I recommend the expression: “they anticipate what might happen.”
  • They consider (or assess) whether that is acceptable. Will they still be able to achieve their objectives, even if they suffer an acceptable level of harm in the process?
  • If either the likelihood of success or the likelihood of great harm is unacceptable, they take action. That action could include not only managing the risk but also changing the strategy or even the objective.

We start in a similar fashion and use plain English rather than risk technobabble. (See Risk Management in Plain English).

But I believe you need to set the right objectives first.

I also believe that rather than assessing risks out of context, you need to consider all the things that might happen and assess whether that totality is acceptable.

In other words, manage success rather than risk and certainly don’t manage one risk at a time.

Beyond that, we seem to be on the same page.

What do you think?

Is this simple approach right? Certainly there is more complexity when assessing the various things that might happen, especially when multiple things might flow from a single decision. But isn’t this a good start?

I welcome your thoughts.

Treating cyber as a business problem

September 23, 2018 10 comments

This post is about wisdom on the one hand and thinking and practices that are less than wise on the other.

I was reading through a 2016 article in the online CSO magazine, CISOs bridge communication gap between technology and risk, when I found these:

Grant Thornton’s Chief Information Security Officer (CISO) said:

“…boards are starting to understand that security is another risk to an organization. It’s not really just an IT issue. The impact that cybersecurity incidents can have on the organization has put it in the same class as other risks to the organization because it can be just as damaging.”

The article also quotes James Christensen, vice president of information risk management for Optiv:

“…at its core, security is an executive level business problem. Five years ago that never would have been a part of the conversation, but now the more successful CSOs are doing this.”

Steven Grossman, vice president of strategy and enablement at Bay Dynamics says:

“The goal is to manage security in a more effective way. It’s all about everybody marching to the same drummer. Bringing together all the silos in the business so that there are no silos”.

He also says:

“I need to understand the business goals. I am speaking to them in terms that they are going to understand.”

 

This makes total sense to me.

Cyber risk can only be communicated to leadership in a way that is meaningful and actionable, enabling them to make informed and intelligent decisions, if it is done using business language. To me, that means talking about the potential effect on enterprise objectives.

How else does a CISO help leaders decide between investing in cyber protection, a new product, an acquisition, a marketing initiative, and so on?

 

Now let’s see what EY has to say in Understanding the cybersecurity threat, perspectives from the EY cybersecurity Board summit.

EY does well by citing the National Association of Corporate Directors’ five principles from their Cyber-Risk Oversight: Director’s Handbook series. The first principle is on the right lines:

Directors need to understand and approach cybersecurity as an enterprise-wide risk management issue, not just an IT issue.

I believe that it is not sufficient to talk about an “enterprise risk management issue”. We should be talking about managing the organization for success. Considering what might happen (risk) is part of how you set and then execute on objectives and strategies.

But apparently that is not how the delegates at the EY conference think.

The number two takeaway from the Summit is:

The board’s role is not cybersecurity risk management; it is cybersecurity risk oversight.

No.

The board’s role is to provide oversight of how management achieves objectives.

As I keep repeating:

It’s not about managing risk. It’s about managing the organization for success!

There will be times when the board should tell management to take the cyber risk because the monies it would take to reduce cyber risk further are better spent elsewhere, such as on new product development.

If we believe that cyber is a business risk, then let’s act like it is.

Find a way to assess and talk about cyber risk in a way that enables informed and intelligent decisions that weigh those and other business risks against the rewards for taking risk.

Work with operating management to understand how a breach might affect what they are doing and what they plan to do.

Help them make informed and intelligent strategic and tactical decisions.

I welcome your thoughts.