Archive for the ‘Risk’ Category

A basic principle most people don’t understand about risk

October 11, 2018 15 comments

Almost everybody makes a fundamental error when it comes to assessing a risk (what might happen).

It doesn’t matter whether they are using a heat map, a risk register, or a risk profile.

They show the level of risk as a point: the likelihood of a potential impact or consequence.

But 99% of the time this is wrong.

99% of the time, there is a range of potential consequences, each with its own likelihood.

Even setting aside the fact that an event, situation, or decision more often than not has multiple consequences, anyone trying to understand risk and its effect on objectives needs to stop presenting the level of risk as a point.

This was brilliantly illustrated in the Ponemon Institute’s latest report on cyber. Their 13th Cost of a Data Breach Study (sponsored by IBM) is an excellent read. It has a number of interesting findings that I will discuss in a separate blog.

The content that is relevant to this discussion is a graphic that shows the range of potential consequences from a cyber breach. Their graphic shows the likelihoods of having anywhere from 10,000 to 100,000 records stolen. (They separately discuss the cost of what they call a ‘mega breach’, when more than a million records are stolen.)

Using their number for the average cost to the business (across all sectors and geographies) of the loss of a single data record, I created the graphic below. (The probabilities are for the next 24-month period.)

[Chart: the range of potential losses from a cyber breach, with the likelihood of each]

As you can see, in their estimation a cyber breach can result in a loss anywhere from $1.5 million to $14.8 million. (The losses suffered by organizations in the medical sector are about triple that amount.) Losses can extend to $350 million for the very few who have 50 million records stolen.

If this is reality, which point do you select to put on a heat map or risk profile?

If you want people to make intelligent and informed decisions relating to this risk, they have to understand the full picture. That picture starts with a chart that shows the range of potential consequences. Ideally, it shows how they might affect enterprise objectives.
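To make that concrete, here is a minimal sketch (in Python) of the kind of table behind such a chart. The roughly $148 average cost per record is implied by the figures cited above ($1.5 million for 10,000 records, $14.8 million for 100,000); the likelihood values are hypothetical placeholders, since the report's actual probabilities are not reproduced in this post.

```python
# A minimal sketch: show the risk of a breach as a range of potential
# consequences, each with its own likelihood, instead of a single point.
# The ~$148 average cost per record is implied by the figures cited above;
# the likelihoods are hypothetical placeholders, not Ponemon's numbers.

COST_PER_RECORD = 148  # USD, implied by $1.5M / 10,000 records

# (records stolen, assumed probability of a breach at least this large
#  over the next 24 months) -- illustrative values only
scenarios = [
    (10_000, 0.28),
    (25_000, 0.20),
    (50_000, 0.12),
    (100_000, 0.05),
]

print(f"{'Records':>10} {'Loss ($M)':>10} {'Likelihood':>10}")
for records, likelihood in scenarios:
    loss_millions = records * COST_PER_RECORD / 1e6
    print(f"{records:>10,} {loss_millions:>10.1f} {likelihood:>10.0%}")
```

A chart built from a table like this gives decision makers the full picture rather than one pre-selected point.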

What is an acceptable level of risk? For certain it’s not an ‘amount’, as preached by COSO. I talk about an acceptable likelihood of achieving your objectives.

But let’s just focus on this graphic for now.

 

Is the range of potential consequences and their likelihood acceptable?

Are there any individual points in the range that are unacceptable?

Does it make sense to use techniques like Monte Carlo to replace a chart with a single number?

How do you provide actionable information that enables intelligent and informed business decisions?
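On the Monte Carlo question above: a simulation does not have to end in a single number. Here is a minimal sketch, using an entirely hypothetical lognormal distribution of breach sizes and the roughly $148 per-record cost implied earlier, showing how the same run can report both a single "expected loss" and the percentiles that reveal the tail.

```python
# A minimal Monte Carlo sketch of breach losses over 24 months.
# The breach probability and lognormal size distribution are hypothetical;
# only the ~$148 per-record cost is taken from the figures cited earlier.
import random
import statistics

random.seed(42)
COST_PER_RECORD = 148       # USD
BREACH_PROB_24M = 0.25      # hypothetical chance of a breach in 24 months
TRIALS = 100_000

losses = []
for _ in range(TRIALS):
    if random.random() < BREACH_PROB_24M:
        records = random.lognormvariate(10.3, 0.9)  # hypothetical size distribution
        losses.append(records * COST_PER_RECORD)
    else:
        losses.append(0.0)

losses.sort()
mean_loss = statistics.mean(losses)
p50, p90, p99 = (losses[int(TRIALS * q)] for q in (0.50, 0.90, 0.99))

print(f"Mean (the single number): ${mean_loss / 1e6:.1f}M")
print(f"Median: ${p50 / 1e6:.1f}M  90th pct: ${p90 / 1e6:.1f}M  99th pct: ${p99 / 1e6:.1f}M")
```

Reporting only the mean would suggest a modest, routine loss; the percentiles show the tail that a board might well find unacceptable.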

 

I welcome your comments.

 

For more of my views on risk management, consider my best-selling books.

Costco reports a material weakness in internal control. But is it really?

October 6, 2018 14 comments

In a news release on October 4th, Costco Wholesale announced its operating results for the 4th quarter and full year ended September 2nd.

In that release, it stated:

While the Company is still completing its assessment of the effectiveness of its internal control over financial reporting as of September 2, 2018, in its upcoming fiscal 2018 Annual Report on Form 10-K, it expects to report a material weakness in internal control. The weakness relates to general information technology controls in the areas of user access and program change-management over certain information technology systems that support the Company’s financial reporting processes. The access issues relate to the extent of privileges afforded users authorized to access company systems. As of the date of this release, there have been no misstatements identified in the financial statements as a result of these deficiencies, and the Company expects to timely file its Form 10-K.

Remediation efforts have begun; the material weakness will not be considered remediated until the applicable controls operate for a sufficient period of time and management has concluded, through testing, that these controls are operating effectively. The Company expects that the remediation of this material weakness will be completed prior to the end of fiscal year 2019.

This information is surprising on many fronts.

 

For a start, it is rare these days for a company to determine that it has a material weakness related to IT general controls (ITGC).

Let me explain why it is rare, and why I personally question whether management got this right.

A material weakness is defined by the PCAOB Auditing Standard No. 5 (now renumbered as AS No. 2201) as:

“…a deficiency, or a combination of deficiencies, in internal control over financial reporting, such that there is a reasonable possibility that a material misstatement of the company’s annual or interim financial statements will not be prevented or detected on a timely basis”.

Let’s start with what would constitute a material misstatement of Costco’s financials.

Their full-year pre-tax net income, according to the release, is just over $4 billion. Materiality is generally set at about 5% of pre-tax income, which in this case would be $200 million.
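For what it is worth, the arithmetic behind that threshold (and the quarterly figure I return to below) is simply:

```python
# Quick check of the materiality arithmetic, using figures from the release.
full_year_pretax_income = 4.0e9   # just over $4 billion, per the release
q4_pretax_income = 1.4e9          # 4th quarter pre-tax income, per the release
RULE_OF_THUMB = 0.05              # materiality commonly taken as ~5% of pre-tax income

print(f"Annual materiality: ~${full_year_pretax_income * RULE_OF_THUMB / 1e6:.0f} million")
print(f"Q4 materiality:     ~${q4_pretax_income * RULE_OF_THUMB / 1e6:.0f} million")
```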

It is very hard to envisage a situation where a $200 million error would not be noticed.

To meet the threshold for a material weakness, there has to be “a reasonable possibility” that a $200 million misstatement would not be prevented or detected on a timely basis.

Is there a reasonable possibility that defects in “user access and program change-management” could lead to a $200 million error that is undetected by other controls, such as comparisons of actual to forecast, margin analysis, and so on?

In the early years of SOX compliance, ITGC failures were among the top sources of material weaknesses (the others were tax treatments and the organization’s knowledge of accounting rules).

But while ITGC deficiencies continue to be present, it is unusual to see them disclosed as material weaknesses.

The reasons are fairly clear: ITGC deficiencies do not have a direct effect on the financial statements. They simply indicate that the automated controls, or the IT-dependent elements of other controls, may not operate consistently as they should.

Costco has not disclosed any failures of such automated controls or of the IT-dependent elements of other controls, and it would have to if any existed. Neither has it disclosed any accounting errors that flowed from such deficiencies.

If I were on the board of Costco, I would be asking how these control deficiencies might lead to a $200 million misstatement of the annual financials (or a $70 million error in the 4th quarter, when pre-tax net income was $1.4 billion).

It is difficult for me to imagine how that could occur. I may be wrong, but I suspect their audit firm, KPMG, insisted that these deficiencies be categorized as material weaknesses.

Calling these material weaknesses does not seem reasonable to me.

 

What else surprised me?

They are saying that they will have corrected these deficiencies within one year.

Assuming that they truly are material weaknesses, how can it be acceptable to wait a full year to get them fixed?

How can the market rely on their quarterly reports if the system of internal control is deemed ineffective for that period?

I would not accept that as a board member, an investor, or a regulator!

 

Finally, the company concluded in its prior quarterly report that its disclosure controls and procedures (which include its internal control over financial reporting) were effective.

If these were in fact material weaknesses (which I doubt), then the question arises as to when management became aware of them – or should have been aware of them. If that predates the 3rd quarter 10-Q, the company may have a problem.

 

I have to wonder whether companies and their auditors fully understand the principles of SOX compliance and what AS5 actually says!

I teach SOX compliance efficiency and effectiveness to SOX program managers (and their equivalents, such as internal audit management). In my experience, the great majority of companies are doing too much work (and the wrong work), and the external audit firms have lost touch with the principles of the top-down, risk-based approach mandated by the PCAOB.

 

I welcome your views.

 

By the way, Costco shares lost 4% following the news release. It is not clear how much should be attributed to the material weakness disclosure.

 

 

The basics of risk management

September 29, 2018 7 comments

I want to congratulate David Hillson (a.k.a. the Risk Doctor) for his video explaining his view of risk management basics.

In Risk management basics: What exactly is it?, he takes less than five minutes to sum up risk management with six questions:

  1. What am I trying to achieve?
  2. What might affect me? Are there things out there in the future that might help or hinder me?
  3. Which of those things that might affect me are the most important?
  4. What should I do about it?
  5. Did it work?
  6. What changed?

He says that “managing risk is one of the most natural things we can do and one of the most important”. I have to agree, although I don’t think we do it as well as we should.

I like his six questions.

David has written 11 books on risk management, which is more than I have, and I have to admit that I have not read them. While I suspect that we will not agree on every topic, such as the value of risk appetite statements, his six basic questions are similar to my set.

This is what I have included in the book I am writing now, on making business sense of technology risk.

I like to explain risk management as something every effective manager does:

  • They understand where they are today and where they need to go (their objectives).
  • They understand, as best they can, what might happen as they work towards achieving those objectives. I recommend the expression: "they anticipate what might happen."
  • They consider (or assess) whether that is acceptable. Will they still be able to achieve their objectives, even if they suffer an acceptable level of harm in the process?
  • If either the likelihood of success or the likelihood of great harm is unacceptable, they take action. That action could include not only managing the risk but also changing the strategy or even the objective.

We start in a similar fashion and use plain English rather than risk technobabble. (See Risk Management in Plain English).

But I believe you need to set the right objectives first.

I also believe that rather than assessing risks out of context, you need to consider all the things that might happen and assess whether that totality is acceptable.

In other words, manage success rather than risk and certainly don’t manage one risk at a time.

Beyond that, we seem to be on the same page.

What do you think?

Is this simple approach right? Certainly there is more complexity when assessing the various things that might happen, especially when multiple things might flow from a single decision. But isn’t this a good start?

I welcome your thoughts.

Treating cyber as a business problem

September 23, 2018 10 comments

This post is about wisdom on the one hand and thinking and practices that are less than wise on the other.

I was reading through a 2016 article in the online CSO magazine, CISOs bridge communication gap between technology and risk, when I found these:

Grant Thornton’s Chief Information Security Officer (CISO) said:

“…boards are starting to understand that security is another risk to an organization. It’s not really just an IT issue. The impact that cybersecurity incidents can have on the organization has put it in the same class as other risks to the organization because it can be just as damaging.”

The article also has:

“…at its core, security is an executive level business problem. [James Christensen, vice president of information risk management for Optiv, says] “Five years ago that never would have been a part of the conversation, but now the more successful CSOs are doing this.”

Steven Grossman, vice president of strategy and enablement at Bay Dynamics, says:

“The goal is to manage security in a more effective way. It’s all about everybody marching to the same drummer. Bringing together all the silos in the business so that there are no silos.”

He also says:

“I need to understand the business goals. I am speaking to them in terms that they are going to understand.”

 

This makes total sense to me.

Cyber risk can be communicated to leadership in a way that is meaningful and actionable, enabling them to make informed and intelligent decisions, only if it is done in business language. To me, that means talking about the potential effect on enterprise objectives.

How else does a CISO help leaders decide between investing in cyber protection, a new product, an acquisition, a marketing initiative, and so on?

 

Now let’s see what EY has to say in Understanding the cybersecurity threat, perspectives from the EY cybersecurity Board summit.

EY does well by citing the National Association of Corporate Directors’ five principles from their Cyber-Risk Oversight: Director’s Handbook series. The first principle is on the right lines:

Directors need to understand and approach cybersecurity as an enterprise-wide risk management issue, not just an IT issue.

I believe that it is not sufficient to talk about an “enterprise risk management issue”. We should be talking about managing the organization for success. Considering what might happen (risk) is part of how you set and then execute on objectives and strategies.

But apparently that is not how the delegates at the EY conference think.

The number two takeaway from the Summit is:

The board’s role is not cybersecurity risk management; it is cybersecurity risk oversight.

No.

The board’s role is to provide oversight of how management achieves objectives.

As I keep repeating:

It’s not about managing risk. It’s about managing the organization for success!

There will be times when the board should tell management to take the cyber risk because the monies it would take to reduce cyber risk further are better spent elsewhere, such as on new product development.

If we believe that cyber is a business risk, then let’s act like it is.

Find a way to assess and talk about cyber risk that enables informed and intelligent decisions: decisions that weigh cyber and other business risks against the rewards for taking risk.

Work with operating management to understand how a breach might affect what they are doing and what they plan to do.

Help them make informed and intelligent strategic and tactical decisions.

I welcome your thoughts.

Practitioners in a box

September 14, 2018 11 comments

You know the expression, “think outside the box”?

Well, over the years I have met many risk and audit leaders who did just that.

They came into a new position and formed, then led, a function that was creative and well-received by top management and the board.

However, they fell in love with their creation.

They thought they had found the answer.

But the world is changing and so are the questions.

What might have been outstanding when established can become barely adequate, if that, over time.

That time may even be as short as a year or two!

What these leaders have done is build a new box around themselves: a box built with the ideas of the past.

Successful leaders are constantly challenging themselves and fixing things even if they are not broken – yet.

They listen to new ideas and techniques, not blindly but with an appropriate level of skepticism and openness.

As you know, I have written here and in my books that the practices of internal audit and risk management need to change.

The practices that worked well in the past don’t help our leaders and organizations succeed today.

The old style of creating and then managing a list of risks, or a static audit plan composed of audits of locations and processes instead of how enterprise risks are managed, needs to be vigorously discarded.

When I speak at conferences around the world, excited auditors and risk practitioners tell me they want to embrace the ideas in Auditing that Matters.

The trouble is that their leader lives in a box built of his or her past success.

Is that your world? It sounds claustrophobic to me.

I welcome your thoughts.

Deloitte Internal Audit 3.0 has major flaws

September 7, 2018 12 comments

Earlier this year, Deloitte published Internal Audit 3.0, The future of Internal Audit is now.

It’s great that they are encouraging internal audit departments to change so they can meet modern demands, but the suggestion that they are offering something novel and disruptive is way off the mark.

As I read the report, I was almost immediately struck by errors of fact. For example, in Figure 1 on page 2, they show IT auditing as starting around 2010. This is absurd. I was running the IT audit function for a major US corporation in 1981! Indicating that data analytics is a present-day development, when it is a technique that has been used for over 30 years, makes me wonder.

In Figure 2, they show integrated audits and cyber risk starting around 2010 and 2012. Does Deloitte have an alternative set of historical facts?

The authors seem very proud to have come up with the “triad of value that Internal Audit stakeholders now want and need”. That triad of “Assure, Advise, and Anticipate” is nothing new. In fact, the IIA’s Mission Statement (published in 2015) is:

To enhance and protect organizational value by providing risk-based and objective assurance, advice, and insight.

The Core Principles of Effective Internal Auditing (also 2015) include:

  • Provides risk-based assurance.
  • Is insightful, proactive, and future-focused.
  • Promotes organizational improvement.

In three years, Deloitte has come up with new words that, in my opinion, are not as powerful as those the IIA came up with in 2015. (Full disclosure: I was a member of the task force that developed both the Mission and the Core Principles.)

They have replaced “insightful, proactive, and future-focused” (a wonderful set of words, each with great meaning) with “anticipate”.

This is not progress.

When they discuss Assurance, they say:

Assurance on core processes and the truly greatest risks is essential but so is assurance around decision governance, the appropriateness of behaviors within the organization, the effectiveness of the three lines of defense (LoD), and oversight of digital technologies.

I agree with them on assurance related to the risks that matter. I also like the emphasis on decision-making and organizational culture.

But oversight of technology is hardly new, and they don’t seem to understand that when you take an enterprise-risk-based approach there is no need to provide separate assurance on individual processes. Audits of how management provides reasonable assurance that the risks that matter are identified, understood, and addressed will fully encompass the controls within the processes that manage those risks.

I can’t say more than this: an emphasis on the 3LoD is absurd. Why call it out? Just focus on the risks that matter and provide proactive and future-focused assurance, advice, and insight.

They also say:

Anticipating risks and assisting the business in understanding risks, and in crafting preventative responses, transforms Internal Audit from being a predominantly backward-looking function that reports on what went wrong to a forward-looking function that prompts awareness of what could go wrong, and what to do about it, before it happens.

This is a management function. Internal Audit should assess whether management has the capability to identify, assess, and address new or changing risks. If they don’t, we can provide advice and insight that will help them upgrade their processes.

They miss the point that insight should refer to the internal auditor sharing more with management than the standard language of the internal audit report. For example, is the manager of the audited function competent, and does he or she treat employees well? Is there a morale problem?

Then there is this:

Now, what if – using digital assets – core assurance could be automated, significantly reducing the resources needed to cover these traditional, core processes on a more continual basis? Automated core assurance harnesses analytics, robotic process automation (RPA), and artificial intelligence (AI) to monitor controls and flag non-conformance in real time. Combine this with automated reporting, and Internal Audit can communicate non-conformance to the business so they can remediate immediately, rather than only being able to check the controls every few years under a rotational audit plan scenario.
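To make the quoted idea concrete, here is a minimal sketch of the kind of automated check it describes: a routine that scans transactions and flags those that appear to have bypassed an approval control. The data shape, field names, and threshold are all hypothetical. The question, of course, is who should own a routine like this.

```python
# A minimal sketch of automated control monitoring of the kind the quote
# describes: flag payments that appear to bypass an approval control.
# The data structure, field names, and threshold are all hypothetical.
from dataclasses import dataclass
from typing import List, Optional

APPROVAL_THRESHOLD = 50_000  # hypothetical: payments above this require approval

@dataclass
class Payment:
    payment_id: str
    amount: float
    approved_by: Optional[str]  # None means no recorded approval

def flag_nonconformance(payments: List[Payment]) -> List[Payment]:
    """Return payments above the threshold that lack a recorded approval."""
    return [p for p in payments if p.amount > APPROVAL_THRESHOLD and not p.approved_by]

# Example run with made-up data
batch = [
    Payment("P-001", 12_000, "a.jones"),
    Payment("P-002", 75_000, None),          # should be flagged
    Payment("P-003", 120_000, "b.smith"),
]
for p in flag_nonconformance(batch):
    print(f"Non-conformance: {p.payment_id} for ${p.amount:,.0f} has no approval")
```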

Let me present a contrary view.

  • If digital assets can be deployed to detect non-conformance, they should be used by management as detective controls, not by internal audit (except in rare cases, such as fraud detection).
  • Internal Audit should assess whether management has effective preventative and detective controls in place, not be those controls itself.
  • When Internal Audit uses continuous auditing techniques (which have been advocated for decades), there is a danger that they are not assessing the controls management has in place and therefore are unable to provide an opinion on them.
  • It is quite possible for there to be no errors in the data even though the system of internal control is deficient.
  • This recommendation will support the view of Internal Audit as the corporate police rather than a business partner.

I know some of the Deloitte leaders and don’t understand how they could publish a document like this.

I suggest they read Auditing that Matters (2016).

 

Your thoughts?

Uniting risk management with strategic planning

September 4, 2018 7 comments

Who can argue that the consideration of what might happen (what some refer to as risk) should be part of the strategic planning process?

Objectives and strategies should be set only after thinking carefully about where you are, what is happening around you, and what may happen in the future. They should then be executed, keeping an eye as you progress on anything that may affect the success of your journey.

I much prefer talking about ‘what might happen’ than ‘risk management’, because while the terms should be synonymous, the word ‘risk’ has a negative connotation. Indeed, the practice of risk management is far too often limited to identifying all and only the things that might go wrong and putting them in a list or heat map.

Neither of those (a list of risks or a heat map) helps executives make decisions, including deciding on objectives and strategies and then executing on them.

My good friend, Alex Sidorenko, tells a story I love. He worked with the senior executives to develop a list of the top risks facing a major organization where he was CRO and took it to the CEO for a discussion. The CEO turned his nose up and told Alex that the list wouldn’t change anything he was doing. It wouldn’t help him make decisions and run the company.

Alex returned from this with a resolution to stop focusing on a list of risks (except where required for compliance purposes, when he would do it as cheaply as possible) and focus on what I would call decision support. He works to help people make informed and intelligent decisions.

Now we have an interesting article on this topic by Mike Skorupski, corporate head of ERM at Siemens Gamesa, a renewable energy company in Denmark.

Uniting risk management with strategic planning urges risk practitioners to get more involved in and add more value to the strategy-setting process.

Skorupski sees more in the COSO ERM guidance than I do when it comes to strategy-setting. While I can see that COSO suggests that risks to strategies be identified after objectives and strategies have been established, he reads COSO ERM the way it should have been written: you consider where you are, what is happening, and what might happen before establishing enterprise objectives.

Where I differ from Skorupski is on the focus on the negative.

Objectives and strategies should be set and then managed with an eye on all the things that might happen, both the positive and the negative.

Expert practitioners have tools, like Monte Carlo simulations, that help assess the range of possible future situations and their effects on objectives, and the likelihood of those possible effects.

But they are used to applying them only to calamity management, not to the range of rewards and opportunities.
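Here is a minimal sketch of what it looks like when the same machinery is pointed at both sides: a simulation of a profit objective with hypothetical upside drivers (customer spending, vendor pricing) and a hypothetical downside event, reporting the likelihood of achieving the objective. Every figure is made up for illustration.

```python
# A minimal sketch: use the same Monte Carlo machinery for upside and
# downside, not just calamities. All figures and distributions are
# hypothetical, purely for illustration.
import random
import statistics

random.seed(7)
PLAN_PROFIT = 100.0   # $M, the objective
TRIALS = 50_000

outcomes = []
for _ in range(TRIALS):
    profit = PLAN_PROFIT
    # Upside: favorable trend in customer spending (hypothetical)
    profit *= 1 + random.gauss(0.02, 0.04)
    # Upside: improved pricing from major vendors, some of the time (hypothetical)
    if random.random() < 0.30:
        profit += random.uniform(1.0, 4.0)
    # Downside: a damaging event such as a breach or outage (hypothetical)
    if random.random() < 0.15:
        profit -= random.uniform(2.0, 12.0)
    outcomes.append(profit)

likelihood_of_success = sum(o >= PLAN_PROFIT for o in outcomes) / TRIALS
print(f"Expected profit: ${statistics.mean(outcomes):.1f}M")
print(f"Likelihood of meeting the ${PLAN_PROFIT:.0f}M objective: {likelihood_of_success:.0%}")
```

The output is not a list of risks but a likelihood of achieving the objective.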

Do you make decisions by considering only what might go wrong? Or do you also consider what might go well?

Don’t you make decisions after thinking through all the possibilities?

What will management and the board think if the CRO is only telling them about the likelihood of the sky falling?

Why not help management assess the possibilities of favorable trends in customer spending, an uptick in the economy, or improved pricing by major vendors – using the same methods as they do for potential harms?

 

I welcome your thoughts.