Archive

Posts Tagged ‘ISACA’

The US Supreme Court and risk assessment

October 12, 2023 3 comments

A decision by the US Supreme Court in 1976 is relevant to practitioners for several reasons.

One of the questions before the court was what the word “material” meant[1].

In TSC Industries, Inc. v. Northway, Inc., 426 U.S. 438, the Court said that a fact is material if there is “a substantial likelihood that the …fact would have been viewed by the reasonable investor as having significantly altered the ‘total mix’ of information made available.” It also said that determinations of materiality require “delicate assessments of the inferences a ‘reasonable shareholder’ would draw from a given set of facts and the significance of those inferences to him ….”

Why is this relevant to risk assessment?

It doesn’t talk about an investor’s decisions being affected by a single fact in isolation.

Instead, it talks about the “total mix” of information.

The SEC refers to the same idea in its new cyber disclosure rule:

The Commission affirmed in the Proposing Release that the materiality standard registrants should apply in evaluating whether a Form 8-K would be triggered under proposed Item 1.05 would be consistent with that set out in the numerous cases addressing materiality in the securities laws, including TSC Industries, Inc. v. Northway, Inc., Basic, Inc. v. Levinson, and Matrixx Initiatives, Inc. v. Siracusano, and likewise with that set forth in 17 CFR 230.405 (“Securities Act Rule 405”) and 17 CFR 240.12b-2 (“Exchange Act Rule 12b-2”).  That is, information is material if “there is a substantial likelihood that a reasonable shareholder would consider it important” in making an investment decision, or if it would have “significantly altered the ‘total mix’ of information made available.”

This is my point:

When an organization’s management or board makes a decision, the decision-makers should consider the “total mix” of information about what might happen (i.e., risks and opportunities) rather than one source of risk at a time.

I explain this in my books with examples like this.

The CEO is considering whether a new product is ready and should be released a month earlier than planned.

She consults her direct reports, including the CRO, and is told that:

    • Product quality risk is low to moderate and within acceptable limits.
    • The risk that related marketing will not be effective is low, and the EVP Marketing says she is “cautiously optimistic”.
    • The additional cyber and information security risks that would be created by the new product are also within defined tolerance levels.
    • The risk that the help desk and other support functions will not be able to handle the additional volume of calls and need for support is moderate and acceptable.
    • The risk that our sales staff are not fully trained in the new product is also “on the low side”. Management is confident they can handle it, “as they always do”.
    • While our cash is currently low, the risk to our cash flow that would be posed by the major expenditures needed for the rollout is ‘manageable’.
    • The upside is high, and Sales are 90% confident of achieving or exceeding projected revenue targets.

The CEO considers all of the above (i.e. the ‘total mix’ of information, or what I call ‘the big picture’), not just one source of risk or opportunity, and asks what would change if they deferred release for a month.

She decides that taking the additional risk in the earlier release scenario is not justified by an additional month of revenue, given all the things that might go wrong and the opportunity to mitigate them over the next month.

Even though each of the individual sources of risk is within tolerable levels, the wise business decision is to wait.

The UK’s FRC has proposed changes to the UK’s Corporate Governance Code that require risk management and systems of internal control to focus on what would be material to the achievement of enterprise objectives.

This again requires considering the total mix of risk and opportunities (the ‘big picture’), not just one source of risk at a time.

When will risk practitioners move from lists of individual risks to helping decision-makers and leaders of the organization see the big picture?

When will they help them see how the total mix of risks and opportunities will affect the achievement of enterprise objectives?

Risk practitioners need to change.

Internal audit practitioners need to start reporting as ineffective any risk program that only provides a list of risks and doesn’t help leaders make informed and intelligent decisions.

What do you think?

——————————————————————————————————————–

[1] The definition they arrived at is still used by the regulators when defining “material” for Sarbanes-Oxley compliance and other matters of law and regulation.

Reporting to the board and its committees is a source of serious risk

October 9, 2023 3 comments

Both internal audit and risk functions have to be cognizant of one significant fact when it comes to providing their periodic reports to the board (or to an audit or risk committee of the board).

Members of the board are being overwhelmed by a ton of information prior to and then at each board or committee meeting.

Consider this piece from Board Intelligence, Directors say board packs are worsening. Here’s why and how to fix it.

Their key findings (I have changed the order) include:

  • The average board pack is now 213 pages long (up 22% in three years).
  • Only 28% of directors and governance professionals get value from board papers.
  • 66% must dig to find the key messages within their reports.

When you add the fact (and I have seen this many times) that board members are sent the 213 pages (on average) only a few days (sometimes less) before the meeting, you can see that there’s a crisis: the board is not getting the information they need, when they need it, in a readily consumable and actionable form.

This obviously has serious ramifications for the effectiveness of the board, and possibly the organization as a whole.

Add some additional facts from the report to the toxic mess:

  • …nearly a third find their papers to be a hindrance (30%). In other words: for the first time since the launch of these tools, more board members and governance professionals get negative value from their board papers than positive value.
  • Overall, 80% of board members and governance professionals now score their board packs as “Weak” or “Poor” (up 10% since 2019).

How can the board, or either the audit or risk committee, be effective under these circumstances?

Too often, board time is taken up by reading management reports rather than discussing what they mean.

So what does this mean for practitioners?

  1. We need to limit what we provide in the way of reports to what board members need to know, rather than trying to show how much we have accomplished. It’s not about us, it’s about their effectiveness as a governance body.
  2. Send our reports early, so they have time to consider them. While management may ask that we include our reports in their board package, we should not wait if it appears likely their reporting will be late in arriving. It’s also to our advantage as board members are more likely to pay attention to our short reports when they are delivered separately and ahead of the rest of the herd.
  3. Consider our responsibility to the board and the fact that excessive and delayed board books are a risk to effective governance. Work with the board and top management to set the clear expectation that board books are:
    1. Concise
    2. Strategic
    3. Timely
    4. Easy to absorb
    5. Actionable

One of the audits I have often included in my audit plan is of the controls over the production of the board book, ensuring that the information it includes is reliable as well as timely. Too often, it is aspirational and optimistic rather than supported by facts.

What is your experience?

What is your opinion?

What do you think?

#risk #audit #GRC #IIA #governance #ERM

The reality of risk reporting

October 5, 2023 1 comment

My thanks to Resilience for an image that captures my view of most risk reporting:

Funny cyber image

While this focuses on cyber risk, I think it is appropriate for most risk reporting to the board and top management.

Risk practitioners need to provide decision-makers with the information they need, when they need it, in a form and language that informs and enables intelligent decisions by them – information they understand and can use effectively.

Decision-makers need to understand all the risks (and opportunities) relevant to their decision so they can take the right level of the right risks.

Not one at a time, but together.

They need to be able to balance the potentials for (downside) risk and (upside) reward. That requires that upsides and downsides are assessed similarly, so they can be aggregated and compared.

If decision-makers consider each risk one at a time, especially if separated from the consideration of reward, they will almost certainly make a poor decision.

Information about risk and reward needs to be reported in a way that enables informed and intelligent decisions, so the following are simply not useful for decision-makers:

  • Heat maps that show each source of risk separately and as a point instead of a range.
  • Risk registers and other lists of risks such as risk profiles.
  • Assessments of individual risks as high, medium, or low.
  • Quantification of risks to information assets.

As Carol Williams recently said in a post, practitioners have to deliver what management needs and wants.

ERM needs to be transformed from something you have to do (check the box) to something management wants, because it helps them and the organization succeed.

In my books, I have suggested this risk reporting:

  1. Integrated risk and performance reporting that shows, for each enterprise objective, the likelihood of achieving, under-achieving, or exceeding it – considering the current state and all the things that might happen (risks and opportunities). Boards and CEOs love this.
  2. Reporting of those few risks that merit individual and continuing attention, typically because they (a) are likely to have a major effect, and (b) are likely to affect multiple objectives.
  3. Tailored reporting for each strategic and major tactical decision – the information the decision-maker needs.
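
As a sketch of what the data behind the first of those reports might look like, here is a hypothetical example. The objectives and probabilities are invented purely for illustration, not taken from any real report:

```python
# Hypothetical integrated risk-and-performance report: for each enterprise
# objective, the assessed likelihood (considering the total mix of risks and
# opportunities together) of under-achieving, achieving, or exceeding it.
objectives = [
    # (objective, P(under-achieve), P(achieve), P(exceed))
    ("Grow revenue 10%", 0.25, 0.55, 0.20),
    ("Launch product by Q3", 0.15, 0.70, 0.15),
    ("Keep customer churn below 5%", 0.35, 0.55, 0.10),
]

print(f"{'Objective':<32}{'Under':>8}{'Achieve':>9}{'Exceed':>8}")
for name, under, achieve, exceed in objectives:
    # The three outcomes are exhaustive, so the probabilities must sum to 1.
    assert abs(under + achieve + exceed - 1.0) < 1e-9
    print(f"{name:<32}{under:>8.0%}{achieve:>9.0%}{exceed:>8.0%}")
```

The point of the format is that each line aggregates everything that might happen, good and bad, into a single picture for one objective, instead of listing risks one at a time.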

What does your risk reporting and communication look like? Does it enable informed and intelligent decisions? Does it provide leaders with useful and timely information on whether enterprise objectives will be achieved? Does it enable decision-makers to take the right level of the right risks for success?

I welcome your thoughts.

More useful information about cyber risk

September 14, 2023 2 comments

A couple of new pieces provide some interesting insights into cyber risk and the effect of the new SEC cyber disclosure rules.

Matt Kelly, formerly of Compliance Week and now the editor of Radical Compliance, shares the news with us each week. One topic he covered at the end of August was A Look at Actual Cyber Disclosures.

He tells us:

…the most contentious part of the SEC’s new cyber disclosure rule is the section requiring companies to disclose “material cybersecurity incidents” within four days of deciding that the incident is material. If we examine what companies have already been disclosing, that might give us all a better sense of the challenges ahead to meet those new and expanded disclosure details.

To answer that question I skimmed through the most recent quarterly filings of S&P 500 firms, looking for any reference to “cybersecurity incident” or “cybersecurity event.” I did indeed find several, so let’s take a look.

The first was:

AmerisourceBergen, the pharmaceutical giant with $238.6 billion in annual sales. Tucked away in the Management Discussion & Analysis of its quarterly report, the company said it suffered a cybersecurity incident at a foreign subsidiary in March 2023. The incident struck a legacy IT platform and disrupted the foreign unit’s operations for roughly two weeks.

AmerisourceBergen didn’t disclose the precise cost of that attack, but it did leave some clues elsewhere in the 10-Q. Specifically, the company said that its costs to recover from the event were a majority of its “Other” expenses listed for both Q2 and the preceding nine months. Those amounts were reported as $2.33 million and $40.54 million, respectively.

Well, do the math. If a majority of those costs were due to the cyber incident, the amount had to be at least $20.3 million, which is 50.1 percent of $40.54 million. (It’s quite possible the actual total was much more than $20.3 million; we don’t know.)

I also looked back to Amerisource’s previous quarterly report for Q1 2023, filed on May 2 of this year, which would have included March 2023 events. It said essentially the same thing: attack at a foreign subsidiary, legacy system knocked down for two weeks, majority of “Other” costs for the quarter.

This disclosure raises some interesting questions about materiality. Clearly a $20.33 million (at least) cybersecurity incident is not quantitatively material to a company with $238.5 billion in revenue — but is it somehow qualitatively material?
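
Matt’s lower-bound arithmetic can be checked in a couple of lines, using the figures he quotes:

```python
# Figures quoted above from AmerisourceBergen's 10-Q, in $ millions.
other_expenses_nine_months = 40.54   # "Other" expenses for the nine months
annual_sales = 238_600               # $238.6 billion in annual sales

# "A majority" of those expenses means just over half, so the incident
# cost at least:
lower_bound = other_expenses_nine_months * 0.501
print(f"At least ${lower_bound:.2f}M, or {lower_bound / annual_sales:.4%} of annual sales")
```

Even at the lower bound, the incident amounts to well under a hundredth of one percent of annual sales.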

Matt questions whether this breach, which cost a significant amount of money, was material. However, he doesn’t ask the more important questions:

Did the breach result in the company failing to meet its enterprise objectives, such as revenues and profits? Did it have an effect that would potentially change a reasonable investor’s decision about buying/holding/selling their shares?

On August 2, 2023 the company reported:

AmerisourceBergen is updating its outlook for fiscal year 2023. The Company does not provide forward-looking guidance on a GAAP basis, as discussed below in Fiscal Year 2023 Expectations. Adjusted diluted EPS guidance has been raised from the previous range of $11.70 to $11.90 to a range of $11.85 to $11.95.

This was after we were told:

AmerisourceBergen Corporation (NYSE: ABC) today reported that in its fiscal year 2023 third quarter ended June 30, 2023, revenue increased 11.5 percent year-over-year to $66.9 billion. On the basis of U.S. generally accepted accounting principles (GAAP), diluted earnings per share (EPS) was $2.35 for the third quarter of fiscal 2023 compared to $1.92 in the prior year third quarter. Adjusted diluted EPS, which is a non-GAAP financial measure that excludes items described below, increased 11.5 percent to $2.92 in the fiscal third quarter from $2.62 in the prior year third quarter.

I find it hard to believe that the breach was in any way material when, at the same time, the company was raising its earnings guidance.

Matt also talked about Ingersoll Rand:

Industrial equipment manufacturer Ingersoll Rand discussed two cybersecurity incidents in its most recent quarterly report: one that had just happened, and another that had happened a while back.

Let’s start with the recent attack. In a section titled “Recent Developments,” Ingersoll had this to say:

On April 27, 2023, the company detected a cybersecurity incident that resulted in a disruption of several of our information technology systems. We immediately launched a thorough investigation with the assistance of external cybersecurity experts to assess and mitigate impacts of the incident. The company proactively took immediate actions to maintain business continuity and to minimize disruption to operations and customers, including isolating systems and implementing workarounds. As a result, we do not expect this incident to have a material impact on our business, results of operations or financial condition. Although an investigation is ongoing, the company is not aware of any confidential customer information having been exfiltrated. If the Company becomes aware of any such information having been exfiltrated, it will make appropriate notifications.

Ingersoll’s other cybersecurity incident peeks out at us from its financial reporting. When the company reported adjusted EBITDA for Q2 2023, it included a $2.2 million adjustment labeled “cybersecurity incident costs.” In a footnote the company further described that item as “non-recoverable costs associated with a cybersecurity event,” whatever that means.

The company reported net income of $180.8 million in the quarter ended June 30, 2023.

Neither breach was even close to being material to an investor. In fact, I wouldn’t think they were material to the executive team or the board.

His final company is Conagra Brands:

Conagra Brands reported a $4.4 million adjustment to earnings for its most recent fiscal year (which ended May 28) under the label “Third-Party Vendor Cybersecurity Incident.”

Its operating profit for Q4 was $433.3 million.

Again, I don’t see this as being material to a reasonable investor.

Matt asks “whether or when a cyber incident might be qualitatively material. For example, was this a mission-critical vendor? Seems possible, since its failure disrupted Conagra’s ability to fulfill customer orders. So how had Conagra assessed its vendors’ cybersecurity? What backup plans were or weren’t in place to activate a backup system?”

Sorry, Matt, a reasonable investor wouldn’t change their investment decision because the backup failed once. If the breach affected “Conagra’s ability to fulfill customer orders” to the extent that it failed to meet the market’s revenue and profit expectations, that would be material.

The impact might have been material according to this report, which said revenues were lower than predicted:

Conagra Brands Inc (CAG) surpassed earnings projections on 7/13/2023 for Q4 2023.

Analyst expectations for Conagra Brands Inc earnings per share (EPS) were at $0.59, with the company surpassing those estimates with a reported EPS of $0.62, leading to a positive surprise of $0.03 per share (5%). Conagra Brands Inc’s earnings were down 5% year-over-year as the firm reported an EPS of $0.65 in its year-ago quarter. The negative annual growth shows the Consumer Defensive company is struggling to find form amid recent economic conditions.

Revenues were downbeat at $3 billion. That is an increase of 2.16% in revenues from the year-ago report and is 0.57% lower than consensus estimates set at $3 billion.

But the company didn’t even mention the breaches in its press release (as reported by Vending Times):

“Our business delivered strong results in fiscal 2023, as we successfully delivered on our priorities to execute inflation-justified pricing, drive gross margin recovery and reduce net leverage while investing to maintain the strength of our brands,” Sean Connolly, president and CEO, said in the press release. “Looking ahead, we anticipate transitioning toward a more normalized operating environment in fiscal 2024 — with easing inflationary pressures and improved supply chain operations — and remain committed to our long-term financial algorithm.”

Doesn’t seem material to me.

X

PwC has shared a few pieces on the new #SEC cyber rules. You can find them all in SEC’s new cyber disclosure rule: How to prepare for disclosures in a new era of transparency.

One of the referenced pieces is Making materiality judgments in cybersecurity incident reporting.

They have some good ideas, but don’t start the way I would – by asking:

  • What information do investors rely on in making their decisions? Is it just revenue and net income, or are they also interested in market share, new product acceptance, and so on?
  • What changes in that information would be material to them?

I recently posted a blog and a short YouTube video on this topic that you might find of interest.

X

How is your organization preparing for the new cyber disclosure rules?

Are you involved?

How do you measure internal audit effectiveness?

September 11, 2023 6 comments

I have recently seen a few articles that discuss this issue, including:

X

Before diving into these pieces, I want to recommend the use of a maturity model to assess an internal audit function. While there is huge value in asking our customers to assess our value to them and their success, it is also hugely valuable to assess the structure, process, and staffing of the internal audit department – and I believe a maturity model is the best way. One such model (mine) can be found in Is your internal audit world-class? A maturity model for internal audit.

Also before considering these articles, I want to make another very important point:

Compliance (or conformance) with the IIA’s Standards is neither an indication nor a guarantee of quality internal audit practices and the delivery of the valuable assurance, advice, and insight our leaders need in driving the organization to success.

I am not saying that we should ignore the Standards or deliberately violate them. They are useful food for thought. I have written elsewhere about their shortcomings and will not repeat them here. (OK, I lie; they provide guidance on auditing risks to a process or auditable entity rather than risks to enterprise objectives (true risk-based auditing) and confuse an audit universe with a risk universe; they fail to talk about insight; they mandate excessive red tape requirements of low value; and more.)

X

Now let’s review the first of these four articles. It’s a good place to start as it discusses metrics that many have used for ages – even though they are totally useless (I’m not pulling any punches today) and even point departments in the wrong direction.

It identifies:

  • Measuring the percentage of the plan completed. In these days of dynamic change, internal audit departments should be updating the audit plan (at the speed of risk and the business) continuously to address the more significant sources of risk to enterprise objectives. 100% completion of an annual audit plan that was developed at the beginning of the year (or even late in the prior year) is an indication of fragile rigidity instead of agility. Almost certainly, audits have been performed of areas that are no longer a significant source of risk, and new or changed sources of risk have been overlooked.
  • Measuring actual hours against budget. We need to spend the time necessary to deliver optimal value. If that is not in line with the budget, change the budget. Again, we need to be agile instead of having our heads stuck in the mud of bureaucracy.
  • Measuring audit cycle times and issuing reports within 15 days of completing fieldwork. Where is the focus on providing the valuable assurance, advice, and insight our customers need when they need it? Take the time to work with management to get the right changes and controls in place, not just to issue a report with recommendations.
  • Measuring auditor and technology utilization rates. I will agree there is value in knowing that the team is being efficiently deployed, but why the rush to deploy analytics? When you are agile and auditing controls over the risks of today and tomorrow, you rarely audit the same area annually – and the ROI of single-use analytics can be questionable. (It requires easy and fast-to-use tools, without a significant investment in building the routines.)
  • Putting a dollar value on audit recommendations. There are few ideas that will create as much conflict with management! It is so much better to simply ask management whether IA has helped them do their job. See my questions at the end of this post.
  • Measuring the level of automation in recommendations. This is a very strange idea: that including automation in recommendations (setting aside the issue that we should have agreed action items, not recommendations) is a measure of internal audit effectiveness!
  • Tracking the percentage of audit recommendations that are closed. It should be 100% if internal audit is listening to and working with management to agree on corrective actions that management believes are in its own best interest. However, the article talks about performing audits to confirm the change has been made – a low-value activity when there are so many significant risks that could be audited instead.
  • Surveying auditee satisfaction. This is #1, #2, #3, and more – in fact, the only way to measure the value of internal audit is through the eyes of our customers in management and on the board. Surveys are of questionable value, but open discussions and interviews are invaluable.
  • Measuring risk coverage. This ignores the whole concept of agile, enterprise risk-based auditing.

X

Next is a piece from the staff of IA360 on risk orientation. It quotes Tim Berichon, an analyst with Gartner. (Tim tells me that they just reposted Gartner’s press release.) He says (and I agree with him):

“Risk orientation was the single biggest factor driving high CAE personal effectiveness scores in our study…. CAEs with high-risk orientation improved their personal effectiveness by up to 47%.”

Risk orientation results in the CAE’s audit plan aligning with top enterprise risks, and audit’s risk assessment aligning with other functions. Further, audit recommendations are well-aligned to enterprise risk appetite.

“Everything internal audit does should be oriented to risk… Given the importance of risk orientation, it’s also interesting to note that effective CAEs are more likely to actively participate in enterprise risk management (ERM).”

X

The third article is by an experienced practitioner and thought leader, Hal Garyn. I expect more and he delivers.

I like his ideas, which I will summarize as:

  • Pay attention to, and prioritize, your relationships throughout the organization. This is especially difficult if you are unable to meet people face-to-face.
  • Lean forward to the risks (and opportunities) of tomorrow and be ready to help management navigate them, especially those involving new technologies.
  • Have a business focus and see your value as helping each manager and the business as a whole succeed.
  • Don’t forget the need for root cause analysis. When there is a problem, work with management to fix more than the symptoms.
  • Pay attention to your professional development (even as CAE).
  • Share information as a team.
  • Help your team and your teammates succeed.

X

Richard Chambers has the last word, or at least I am covering his piece last.

I like this part of his blog post:

The longer that I led internal audit departments, the more that I came to realize that efficiency reviews of internal audit shouldn’t wait until the chief financial officer is knocking at the door with pink slips to hand out. Instead, we should maintain a continuous focus on our processes and procedures. There is comfort in having a deep understanding and trust in processes that have been used and perfected over the years. These tried-and-true processes and techniques are taught to new practitioners and offer valuable benchmarks for comparing performance over time. But reliance on this kind of thinking leads us to the punchline of a meme.

Q: How many internal auditors does it take to change a light bulb?

A: It depends. How many did it take last year?

In a dynamic business atmosphere where risks emerge and mature at lightning speed, we as a profession cannot afford this meme mentality. For many internal audit teams, something has to change!

He quotes Toby DeRoche on agile auditing. I have added some language (sorry Toby).

Much has been written in recent years about “Agile Auditing.” My friend Toby DeRoche is one of the foremost experts on the subject, and he defines agile auditing as “… an iterative approach to developing and executing audits, based on a shorter audit lifecycle from assessment to reporting, which focuses on gaining and sharing insights with management related to the most urgent risks in an organization…Agile auditing focuses on the risks that matter most to management right now and will matter tomorrow.”

As Richard indicates, internal auditors must be constantly looking to be more efficient as well as more effective, providing the assurance, advice, and insight the organization needs to succeed, when it needs it. (For an explanation of “assurance” and what it really means, see my latest video.)

They need assurance about the more significant sources of risk (including opportunity) of today and tomorrow.

Any metric must take that into account.

X

We need to make sure we are measuring the value we provide, not how quickly we can issue a report.

The only way to know is to ask the customer and get an honest answer.

Ask them:

  • Are we helping you and your team succeed?
  • Are we wasting any of your time?
  • What can we do better?
  • Is our work so valuable that you would willingly pay for it out of your own budget?
  • Are you thinking of hiring any of the audit staff?

You should also ask yourself:

  • What activities can I eliminate without adversely affecting the value of our work to our customers (i.e., are we Lean)?
  • Do we have the right people to do the work and deliver the quality product our customers need?
  • Are we addressing the risks of yesterday, or the risks of today and tomorrow?
  • Are we working with management or perceived as something other than a partner? Do they embrace our advice and insights?
  • Is their welcome honest and sincere?
  • Do we have the budget we need? If not, why not? Is it because our value is not what we think it is?
  • Have we earned a seat at their table?
  • What can I change today for better value services tomorrow?

I welcome your thoughts.

My books are on sale!

September 7, 2023 2 comments

My goal in writing has always been to challenge traditional thinking and get people to consider how they can move their practices forward.

I am pleased to have seen many of my books get a good reception, but I want to make them more accessible.

With that in mind, I have decided to cut the prices of several books.

You can find all my published books listed at Norman’s Books.

Enjoy!

The “Great Disconnect between InfoSec and the Business”

September 5, 2023 4 comments

A new piece from AuditBoard, Scaling ITRM: The Promise and Challenges of Risk Quantification, has a lot to recommend it.

They talk (correctly) about the Great Disconnect between CISOs and the executive suite. Their failure to understand each other leads to what cyber practitioners see as the underfunding of cybersecurity.

However, I believe their solution is wrong. There is a far better way.

They capture the problem very well when they say:

As technological transformation creates more business-owned digital assets — which contain massive volumes of sensitive employee, customer, and vendor data — information security teams shoulder the heavy responsibility of maintaining effective and resilient IT risk management programs. However, a common challenge many IT security leaders face is obtaining support from their executive peers for necessary investment in resources to manage critical InfoSec risk areas.

On the business side, executives admit to the difficulty of managing increased IT and security risk exposures in today’s fast-paced environment.

80% of executives surveyed in PwC’s 2022 Global Risk Survey report keeping up with the speed of digital transformation is a significant risk management obstacle.

Businesses that do excel in managing both upside and downside risks have the advantage of agility. Their executives have access to accurate and reliable IT security risk data for making strategic and informed decisions at the right time — in other words, ahead of their competition. The backbone of agile IT risk management is accurate risk analysis and modeling, deployed early at the outset of new business initiatives. Yet, less than 40% of business executives are consulting with risk professionals early on.

On the InfoSec side, IT security leaders experience difficulty communicating the business impact of urgent security risks to their executive colleagues who may not have technical backgrounds. Two predominant reasons for this are: 1) limiting narratives around the “right” way to approach risk quantification, and 2) resource constraints that affect InfoSec’s ability to effectively assess and prioritize security risks.

Without addressing this disconnect between InfoSec and the business, an organization’s IT security risk management efforts will be incomplete, and the organization may remain exposed to risks that can impact key business strategies and objectives.

At its most fundamental level, risk quantification requires connecting risk to a meaningful value that captures its impact to the business.

They go wrong when they say that “Risk quantification defines an IT security risk’s impact on the business in terms of a dollar value” (ignoring the fact that the potential impact will be a range, not a single point).

This at least is better than the NIST/ISO/FAIR practice of assessing the “risk to information assets” – pure technobabble.


While they are correct that executives and board members will understand dollars, they are incorrect in saying that means they will understand the potential impact of a breach on the business and what they should do about it.

Dollars are not the whole story.

As I have recounted several times in this blog, most recently in Understanding the new SEC rules requiring cybersecurity risk disclosures, the average cost of a breach is very low – around $4.45 million in a recent study.

Will quantifying the potential impact as $4.45 million excite the executives and board members of a billion-dollar company?

Will that lead them to invest another million or more in cyber, or to change their strategies?

If the quantification comes out as a huge multiple of that number, is it reliable given the studies of actual breach costs? Even then, will it capture their attention or (on the face of it) justify inaction?

So how should risk officers, InfoSec, and cyber practitioners quantify the potential impact of a breach?

The answer is by assessing the risk to the achievement of enterprise objectives! How would a breach affect the likelihood of achieving each of them?

After all, both the ISO 31000 global risk management standard and the COSO ERM framework talk about risk as the effect on (enterprise) objectives.

If you tell the executive team and the board that a breach would put one or more of their objectives, which are how they measure their and the organization’s success, in jeopardy, you will get their attention!

It’s also a way to enable business leaders to compare the level of risk from a breach to the level of risk from a compliance or other failure. They can contrast the value of an investment in cyber to that of an investment in product development, an acquisition, an ERP upgrade, an AI implementation, or a marketing initiative.

Doesn’t that make far more sense?

Won’t that mean more to leaders in management and on the board than a dollar figure?

Isn’t that more in line with what will be required by the new SEC disclosure rules (as explained in that earlier blog post)? They require assessing whether a breach would be material to the investor.

Help leaders understand whether a cyber breach is at least reasonably likely to occur and be material to the achievement of enterprise objectives.

For more on this, see the risk and cyber books shown on the right of this post and in the Norman’s Books tab.

I welcome your thoughts and experiences.

A world-class internal audit function that would not comply with the IIA’s Global Internal Audit Standards (as drafted)

August 30, 2023 3 comments

As the CAE of multiple internal audit departments[1] that were considered world class by members of the board[2], top executives[3], consultants[4], and team members, I would deliberately not comply with the IIA’s Global Internal Audit Standards (as drafted).

Several draft standards are involved, especially:

  1. We would not perform an assessment of the risks to the auditable entity. Instead, we try to focus our scope on the controls at the entity that are relied upon to address the more significant risks to the objectives of the enterprise (i.e., we often do not address risks that are important only to the entity).
  2. We do not include recommendations for each risk and control issue in our audit reports. Instead, we work with management to determine and then report agreed action items. Management believes these are necessary for their own success as well as that of the enterprise, so they get done.
  3. Because the risk is low that the agreed actions will not be completed, we do not formally follow up and report the status of every action item, and rarely if ever perform a follow-up audit. We discuss significant issues with management in our periodic meetings, and that is almost always sufficient.

I posted a video on why I don’t report audit recommendations and some of the viewers found the idea challenging.

I like this comment:

What did you do to foster a collaborative and productive relationship between internal audit and other functions and how fast did you achieve that goal? More often than not audit and other control functions are perceived by the business as the “necessary evil” and this perception may be very deeply ingrained.

There are several keys to building that positive and constructive relationship, including:

  • Believing, and living the belief, that IA exists to help management and the organization succeed, not to make points at their expense.
  • Valuing our contribution based on our ability to provide the assurance, advice, and insight management and the board need, when they need it, in an easily consumed and actionable fashion. In other words, recommendations have no value. Only action has value.
  • Listening to management and treating them, at all levels, with respect. Not being arrogant but instead being humble because they know their business better (I hope) than we do – while retaining our independence and objectivity and standing up for our position when warranted (and after listening). Taking that extra time, letting them get to know us, and getting to know them.
  • Wanting to work with them instead of appearing to work against them.
  • Not surprising them, especially in front of their boss.
  • Going out of our way to praise when appropriate, not just to criticize, and issuing balanced and fair reports.
  • Going out of our way so they don’t look bad when that is neither necessary nor fair.

Another viewer wrote “Don’t the IIA standards require recommendations?”, and my reply was that this was another defect in the GIAS draft!

On LinkedIn, there were another couple of interesting comments.

One expressed disbelief that I never included recommendations because sometimes it is hard to get management to provide an action plan. He said that sometimes “expedience” required the audit report to go out without the action plan. My response was that expedience is not part of my name. I prefer to take the time to sit down and work things out with management – for all the reasons I share in the video.

Another pointed out that some 2/3 of the reports she sees (as a consultant training people in report writing) include recommendations rather than agreed action plans. She said it would require courage to make the change. I disagreed, pointing out that management and the board would both be highly receptive to internal audit taking the time to make sure management owns and takes the right corrective actions.

The one I liked the best said that he had moved to including agreed action items with great success. Another said:

I fully agree with you Norman. It is possible to add value with collaboration. Both auditors and management are valuable assets of the company, and they will find the best way to fix the issues if they collaborate. It is important to add the agreed actions in the executive summary with the findings, because the real value is in the agreed actions.

So, am I wrong to deliberately fail to comply with GIAS (as drafted)? Does that make me the CAE of a less than world-class internal audit function?

I welcome your thoughts.

====================================================================

[1] Notably at Tosco, Maxtor, and Business Objects

[2] “You help us sleep at night”; “you are a model of an effective internal auditor”

[3] “Internal audit gives us a competitive advantage”; “internal audit makes sure we stay efficient”; “keep it up or you’re fired”

[4] The first internal audit function included by Protiviti in the best practices database

High-performing audit teams

August 28, 2023 3 comments

Jonathan Chapman has written another interesting piece for Wolters Kluwer.

High performing internal audit teams: Accountable and efficient makes several points with which I agree. But there are also some with which I disagree, and he has missed some that I believe are essential.

He identifies “four characteristics that in combination lead to the highest levels of performance.” They are:

  1. Ensure their activity is truly aligned with their organization and the value it is looking to create.
  2. Operate internal audit stakeholder relationship management – all of internal audit working together to the benefit of the entire range of internal audit stakeholders.
  3. Adopt an empowering leadership approach such that employees have the direction, support, and freedom to deliver at the highest standards.
  4. Operate in an accountable and efficient manner – clearly transparent in their performance and the work they are doing to continually improve.

The article referenced above is the last in a series that started with High performance internal audit teams: Business alignment. He makes some good points in the first piece, but fails to talk about enterprise risk-based auditing: focusing your audit plan on the more significant sources of risk to the organization and the achievement of its objectives.

I believe that to be a critical omission.

In his latest article, he says:

High performance internal audit teams look to deliver value beyond traditional audit work. Internal audit should be the catalyst for management action that makes the business more secure and sustainable by having a constant focus on control improvement.

I agree, subject to saying (again) that we should perform enterprise risk-based auditing. Deliver “the action” that is most valuable and relevant to the success of the company rather than tinkering around the edges.

Once we know what the more significant risks are, we can think about what types of audit work should be performed (such as assurance vs. consulting), what tools and techniques to use, which members of the extended audit team (i.e., including service providers and guest auditors) would be best to deliver the value we seek, and how much budget (hours and dollars) should be allocated.

I like what he says here:

Internal audit functions that do … ‘flash’ audit work are typically not engaged for more than a week. Reporting is often verbal, supported by a one-page memo, and cleared through internal audit sign-offs. Executive leadership values this approach with its pace and immediacy, allowing for quick change in the business and immediate impact. In high performance internal audit functions, auditors also embrace this pace of delivery that focuses on smaller, more targeted scopes.

In this context, high performance internal audit functions will see traditional full-scope audits operating increasingly as a smaller proportion of the overall book of work, completed each year, with these being increasingly accompanied by all the above activities.

We need to seek to deliver the assurance, advice, and insight the organization needs, when it needs it, in an easily consumed and actionable way.

He also says (with my highlights):

High performance internal audit functions have a truly innovative mindset. All of us see in our work the pressure placed on our front-line colleagues to improve both the service or product they deliver with less headcount or overall lower budgets. Can we help them in this? We are in the privileged position of being able to spend time in all parts of our business, reviewing processes and practices and opining on their effectiveness from a control perspective. What would it take to extend this to also offer ideas and thoughts on areas in which they may improve their quality and/or efficiency? Perhaps drawing on the wider view we have around how other parts of the business are operating doing similar activities? Progressive internal audit functions identify simplification opportunities that they regularly share with the business. One high performing team has even gone as far as including a section on their audit report that identifies specific efficiency opportunities, thus encouraging auditors to consistently be on the lookout for additional opportunities as they conduct their work.

Please join me in telling every member of the IIA’s Standards Board that the insights we share with our leaders can be our most valuable product.

There is more to building and maintaining a world-class internal audit function.

In most of my companies, I was able to build a team of high-powered, smart, intelligent, curious, imaginative, dedicated, and business-focused auditors.

If you don’t have the right people, world-class people, and let them run free, you won’t have a world-class internal audit function.

I welcome your thoughts and comments. Jonathan has a lot to say that is worth considering. Agree?

Understanding the new SEC rules requiring cybersecurity risk disclosures

August 24, 2023 3 comments

A number of articles have been written (especially by the accounting firms and a few law firms) about the new SEC rules, but it is always useful to read what the SEC actually said. You can find the link in their press release.

The press release says:

The Securities and Exchange Commission today adopted rules requiring registrants to disclose material cybersecurity incidents they experience and to disclose on an annual basis material information regarding their cybersecurity risk management, strategy, and governance.

“Whether a company loses a factory in a fire — or millions of files in a cybersecurity incident — it may be material to investors,” said SEC Chair Gary Gensler. “Currently, many public companies provide cybersecurity disclosure to investors. I think companies and investors alike, however, would benefit if this disclosure were made in a more consistent, comparable, and decision-useful way. Through helping to ensure that companies disclose material cybersecurity information, today’s rules will benefit investors, companies, and the markets connecting them.”

The new rules will require registrants to disclose on the new Item 1.05 of Form 8-K any cybersecurity incident they determine to be material and to describe the material aspects of the incident’s nature, scope, and timing, as well as its material impact or reasonably likely material impact on the registrant. An Item 1.05 Form 8-K will generally be due four business days after a registrant determines that a cybersecurity incident is material.

The new rules also add Regulation S-K Item 106, which will require registrants to describe their processes, if any, for assessing, identifying, and managing material risks from cybersecurity threats, as well as the material effects or reasonably likely material effects of risks from cybersecurity threats and previous cybersecurity incidents. Item 106 will also require registrants to describe the board of directors’ oversight of risks from cybersecurity threats and management’s role and expertise in assessing and managing material risks from cybersecurity threats. These disclosures will be required in a registrant’s annual report on Form 10-K.

The key word is material, which I highlighted above and will discuss shortly.

The Final Rule repeats and clarifies the above:

The Securities and Exchange Commission (“Commission”) is adopting new rules to enhance and standardize disclosures regarding cybersecurity risk management, strategy, governance, and incidents by public companies that are subject to the reporting requirements of the Securities Exchange Act of 1934.  Specifically, we are adopting amendments to require current disclosure about material cybersecurity incidents.  We are also adopting rules requiring periodic disclosures about a registrant’s processes to assess, identify, and manage material cybersecurity risks, management’s role in assessing and managing material cybersecurity risks, and the board of directors’ oversight of cybersecurity risks.

The Final Rule has a table on page 12 that summarizes the new requirements. (I am only showing the requirements for domestic filers. Those for foreign filers are very similar but separately stated.)

Item Summary Description of the Disclosure Requirement
Regulation S-K Item 106(b) – Risk management and strategy Registrants must describe their processes, if any, for the assessment, identification, and management of material risks from cybersecurity threats, and describe whether any risks from cybersecurity threats have materially affected or are reasonably likely to materially affect their business strategy, results of operations, or financial condition.
Regulation S-K Item 106(c) – Governance Registrants must:

– Describe the board’s oversight of risks from cybersecurity threats.

– Describe management’s role in assessing and managing material risks from cybersecurity threats.

Form 8-K Item 1.05 – Material Cybersecurity Incidents Registrants must disclose any cybersecurity incident they experience that is determined to be material, and describe the material aspects of its:

– Nature, scope, and timing; and

– Impact or reasonably likely impact.

An Item 1.05 Form 8-K must be filed within four business days of determining an incident was material.  A registrant may delay filing as described below, if the United States Attorney General (“Attorney General”) determines immediate disclosure would pose a substantial risk to national security or public safety.

Registrants must amend a prior Item 1.05 Form 8-K to disclose any information called for in Item 1.05(a) that was not determined or was unavailable at the time of the initial Form 8-K filing.

The requirements in the table are explained in more detail later in the SEC document.

“Material” is the key word, and the Final Rule repeats existing guidance when it says (again with my highlights):

The Commission affirmed in the Proposing Release that the materiality standard registrants should apply in evaluating whether a Form 8-K would be triggered under proposed Item 1.05 would be consistent with that set out in the numerous cases addressing materiality in the securities laws, including TSC Industries, Inc. v. Northway, Inc., Basic, Inc. v. Levinson, and Matrixx Initiatives, Inc. v. Siracusano, and likewise with that set forth in 17 CFR 230.405 (“Securities Act Rule 405”) and 17 CFR 240.12b-2 (“Exchange Act Rule 12b-2”).  That is, information is material if “there is a substantial likelihood that a reasonable shareholder would consider it important” in making an investment decision, or if it would have “significantly altered the ‘total mix’ of information made available.”

By the way, this is the same standard that should be applied in assessing materiality and the presence of a material weakness for SOX compliance.

Do the new rules change SOX compliance requirements? No. SOX is about the integrity of the financial statements filed with the SEC, and the new rules are for additional disclosures. They are not subject to the annual external audit, and are not included within the scope for Sarbanes-Oxley Section 404 (SOX) reporting.

Page 61 repeats in different language what is required when disclosing the organization’s cyber risk management program:

As adopted, 17 CFR 229.106(b)(1) (Regulation S-K “Item 106(b)(1)”) requires a description of “the registrant’s processes, if any, for assessing, identifying, and managing material risks from cybersecurity threats in sufficient detail for a reasonable investor to understand those processes.”

The enumerated elements that a registrant should address in its Item 106(b) disclosure, as applicable, are:

  • Whether and how the described cybersecurity processes in Item 106(b) have been integrated into the registrant’s overall risk management system or processes;
  • Whether the registrant engages assessors, consultants, auditors, or other third parties in connection with any such processes; and
  • Whether the registrant has processes to oversee and identify material risks from cybersecurity threats associated with its use of any third-party service provider.

We have also revised the rule text to clarify that the above elements compose a non-exclusive list of disclosures; registrants should additionally disclose whatever information is necessary, based on their facts and circumstances, for a reasonable investor to understand their cybersecurity processes.

Item 106(b)(2) requires a description of “[w]hether any risks from cybersecurity threats, including as a result of any previous cybersecurity incidents, have materially affected or are reasonably likely to materially affect the registrant, including its business strategy, results of operations, or financial condition and if so, how.”

The SEC states:

The rule requires registrants to describe those processes insofar as they relate to material cybersecurity risks.

They also say:

Under Item 106(c)(1) as adopted, registrants must “[d]escribe the board’s oversight of risks from cybersecurity threats,” and, if applicable, “identify any board committee or subcommittee responsible” for such oversight “and describe the processes by which the board or such committee is informed about such risks.”

Item 106(c)(2) directs registrants to consider disclosing the following as part of a description of management’s role in assessing and managing the registrant’s material risks from cybersecurity threats:

  • Whether and which management positions or committees are responsible for assessing and managing such risks, and the relevant expertise of such persons or members in such detail as necessary to fully describe the nature of the expertise;
  • The processes by which such persons or committees are informed about and monitor the prevention, detection, mitigation, and remediation of cybersecurity incidents; and
  • Whether such persons or committees report information about such risks to the board of directors or a committee or subcommittee of the board of directors.

While some may disagree, I like the SEC’s determination on whether board members should have specific cybersecurity expertise:

We are persuaded that effective cybersecurity processes are designed and administered largely at the management level, and that directors with broad-based skills in risk management and strategy often effectively oversee management’s efforts without specific subject matter expertise, as they do with other sophisticated technical matters.

I am not an attorney, and organizations should consult one before interpreting the Final Rules and making the required disclosures.

Having said that, I would tackle the issue by making sure that we have determined what would be material to the organization.

To do that, we need to understand what would be material to the reasonable shareholder (or potential shareholder) in making an investment decision.

In general, they are interested in both the financial results for the current period and projected future results.

In other words, they are concerned with whether the organization will achieve its enterprise objectives, which should be consistent with financial and other guidance they have shared with investors.

We might use the measure typically adopted for SOX of 5% of pre-tax net income, but that may be too high. I advise consulting the investor relations function and senior executives to understand what they believe would be material to investors.

For example, a breach might cause a level of business disruption that prevents the company from meeting the earnings projections shared with investors, even though the total cost, including loss of revenue, is less than 5% of that projection.

I am not persuaded that using a loss level to define “material” is realistic.

Every organization should consider its own specific facts and circumstances to define what would be a material breach.

As explained in my earlier post, Excellent Insights into Cyber Risk, the average cost of a breach is only $4.45 million. That is far less than 5% of pre-tax net income.
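To make the comparison concrete, here is a minimal sketch of the arithmetic. The pre-tax income figure and the 5% rule of thumb are illustrative assumptions only, not guidance for any particular company:

```python
# Illustrative only: comparing an average breach cost to a SOX-style
# quantitative materiality threshold. The income figure is made up.

def materiality_threshold(pretax_income: float, pct: float = 0.05) -> float:
    """Rule-of-thumb threshold: a percentage (often 5%) of pre-tax net income."""
    return pretax_income * pct

pretax_income = 150_000_000        # hypothetical pre-tax net income
threshold = materiality_threshold(pretax_income)   # $7.5 million at 5%
avg_breach_cost = 4_450_000        # average breach cost from the study cited above

print(f"Threshold: ${threshold:,.0f}")
print(f"Average breach cost ${avg_breach_cost:,.0f} is "
      f"{'above' if avg_breach_cost > threshold else 'below'} the threshold")
```

On these assumed numbers, the average breach falls below the quantitative threshold, which is exactly why qualitative factors (such as missing earnings guidance) matter to the materiality determination.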

Some companies have suffered far larger losses, but those are far larger companies. As explained in Understanding the Business Risk that is Cyber:

According to PCH Technologies (an IT consulting firm) in their 2021 report:

  1. Solarwinds, a company that makes business software, was compromised at some point in 2020. This was an advanced persistent threat (APT) that proved very hard to detect. In total, the company reported losses of $25 million to its investors.

Note: Solarwinds’ revenue in 2020 was $1.1 billion, so the losses were 2.27% of revenue.

  2. Amazon was targeted with a DDOS[1] attack earlier… and it succeeded. They were only down for a little over an hour, but the total losses were somewhere in the neighborhood of $75 million.

Note: Amazon’s revenue in 2020 was $386 billion, so the loss was trivial by comparison.

  3. In May of 2021, Brazilian meatpacking company JBS was the victim of a ransomware attack. The ransom alone was $4.4 million, and the loss of revenue might have been even greater.

Note: JBS’s 2020 revenue was $71 billion.

  4. On May 6, 2021, the Colonial Pipeline was hacked, and the ransom paid by the company was reported as $5 million.

Note: this was 1% of Colonial Pipeline’s 2021 revenue of $500 million.
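Those loss-to-revenue comparisons are simple to reproduce. Here is a short sketch using the figures reported above (all of them rough, third-party estimates, not audited numbers):

```python
# Reported breach losses compared to company revenue (figures as cited above).
breaches = {
    "SolarWinds (2020)":        (25_000_000,   1_100_000_000),
    "Amazon DDoS (2020)":       (75_000_000, 386_000_000_000),
    "JBS ransomware (2021)":    ( 4_400_000,  71_000_000_000),
    "Colonial Pipeline (2021)": ( 5_000_000,     500_000_000),
}

for name, (loss, revenue) in breaches.items():
    print(f"{name}: loss was {loss / revenue:.2%} of revenue")
```

Even the headline-grabbing incidents range from roughly 0.01% to a little over 2% of revenue, which reinforces the point that a raw dollar figure, on its own, rarely looks material.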

To repeat: Every organization should consider its own specific facts and circumstances to define what would be a material breach.

These days, some shareholders and potential investors are concerned with sustainability measures, so potential deviations from corporate targets might well be material.

I am troubled by the need to assess cyber risk separately from (i.e., not as one of many) other business risks. Cyber risk may be less than material when viewed in isolation, but management and the board need to understand the full range of business risks when considering whether, in aggregate, they make the achievement of objectives unacceptably unlikely. In other words, when they indicate that projected results (which have been shared with investors) are unlikely to be delivered. For example, when revenue projections rely on avoiding product quality issues as well as breaches caused by vulnerabilities in the new products, it is the combination of multiple sources of risk that can lead to an assessment that those revenue targets may not be met.

While the SEC rules require that the organization disclose “Whether and how the described cybersecurity processes in Item 106(b) have been integrated into the registrant’s overall risk management system or processes”, I don’t think that is sufficient. I would much prefer to see the SEC require a disclosed assessment of the processes for managing all sources of business risk, rather than limiting it to cyber.

Coming back to my theme…

If I were in charge of a project to ensure compliance with the new SEC rules, I would make sure that there is an adequate cyber risk assessment – one based on how a breach might have a material effect on the achievement of enterprise objectives. Everything flows from that determination.

That is the theme of my risk management writing, here and in my several books.

I welcome your thoughts, especially any insights on the impact of the SEC rules and how organizations should and are responding.

[1] Distributed Denial of Service attack

What keeps IT leaders up at night?

August 10, 2023 Leave a comment

This is the title of a new report from Rocket Software. They surveyed 275 U.S.-based IT directors and vice presidents in companies with more than 1,000 employees.

Respondents were asked about their current IT stressors, their approach to modernizing their IT infrastructure, and how they plan to become more efficient and optimized in the years ahead.

The survey found that IT leaders are focused on reducing risk, increasing efficiency, and optimizing their resources with hybrid cloud solutions.

The priorities reported were:

  1. Improve overall IT performance (60%)
  2. Data security (50%)
  3. Process risk and compliance (46%)
  4. Improve agility (41%)

Interesting (at least to me) was how they said they measured success for their IT function:

  1. Increased efficiency (71%)
  2. Optimized resources (67%)
  3. Reduced risk (63%)

Customer satisfaction was not one of the top three! I am used to seeing IT functions measure, for example, the age of the backlog – another metric not in the top three.

Seven focus areas were identified for this year:

  1. Hybrid cloud (65%)
  2. Data and content management (60%)
  3. DevOps (58%)
  4. Infrastructure and application modernization (58%)
  5. Automation (57%)
  6. Performance management (41%)
  7. Enterprise storage (35%)

There’s a section on IT risk management. After hyping the usual global cyber risk data (so much easier than focusing on each organization’s specific level of risk), the report provides some interesting insights:

  • Only 33% of respondents are extremely confident that they have the right technology/software in place to execute an effective approach to IT risk management.
  • Only 34% of respondents are extremely confident they have the right processes in place to execute an effective approach to IT risk management.
  • Only 28% of respondents are extremely confident they have the right people in place to execute an effective approach to IT risk management.

When I see numbers like this reported by the executives responsible for an area, I have to ask why they are so low. After all, these are the people responsible for getting the right technology, processes, and people!

While 41% are thinking about risk almost every day, they seem to be making other investments their priority.

Is it because they believe other areas, other sources of risk and reward, are better uses for their scarce resources?

Are they right?

Is this, cynic that I am, why so many CISOs seem to want to report to the CEO instead of the CIO – when the CIO is the executive that is most likely to understand cyber risk and how it relates to other enterprise priorities?

There’s more in the report that merits your attention, including discussions of hybrid cloud and DevOps.

What do you think?

  1. Do CIOs have their priorities right?
  2. Are they measuring the right metrics for success?
  3. Is their investment in cyber sufficient, given other demands for those resources?

I welcome your views.

Don’t waste their time!

August 3, 2023 5 comments

I think we can all agree that our top executives are very busy. Their days (and often their evenings and weekends) are consumed by meetings, both virtual and in person.

I once asked a division president where he spent his time (because I wanted to make sure my audit plan addressed the areas of concern to him), and he said that he was usually running from one fire to the next. In fact, he said that the hour or so he spent with me on my periodic visits was the only time he was able to sit back and think about the business as a whole! (I always focused on the business, not the auditing technobabble of “risks and controls”).

In my experience and opinion, based on decades of working with top executives around the world supplemented by my own experience as a vice president in IT, effective executives “manage by exception”.

The principle is that you let your direct reports handle the routine and you spend your time elsewhere. I followed that myself[1], describing it as:

  • Hire the best people.
  • Mentor and guide them as needed, but don’t “manage” them.
  • Trust them and get out of their way. (For example, I was very conscious of the fact that I could be a roadblock if I delayed my review and approval of draft audit reports.)
  • Check in regularly (usually monthly) to confirm things are on track and ask if they need help.
  • Get involved when there are issues of great importance, such as control deficiencies that merit the attention and action of top management, as I had broader and deeper insights into the business as a whole. I also had, usually but not always, better relationships with the top executives.
  • Provide advice or assistance (if advice isn’t sufficient) when they say they need my help, generally not inserting myself when they don’t.
  • Ask questions when there are indications that they are struggling, not to micromanage but to add my advice and guidance.
  • Intervene if they are not performing. On those rare occasions, either their customers or their team members usually alert me.

Just like the top executives, I was busy and couldn’t afford to spend my time on the mundane and routine.

Why should risk and audit practitioners and leaders expect their stakeholders to have the time, let alone the inclination, to read detailed reports that run many pages and talk about the mundane and routine?

Why should these busy people spend their precious time on reports that:

  • Won’t change any strategic decisions they are making or actions they are taking?
  • Can easily be handled by their direct reports (or the next level down) without intervention by them?
  • Have information that they really don’t need to know? It might be information the practitioner thinks is important and wants the executive to know, but which doesn’t change anything.
  • Demonstrate only that the practitioner is doing their job?

My advice is:

  • Spend your time on the areas and issues that matter to the success of the organization, the achievement of its objectives.
    • For the risk practitioner, that means helping management navigate those sources of risk that are significant to the achievement of objectives, only considering less important risks to make sure they are not becoming significant (and perhaps that they are being watched by lower levels of management).
    • For the audit practitioner, only include audits in the audit plan, and risks in the scope of those audits, where a control deficiency would threaten the achievement of enterprise objectives[2]. Focus on the issues that matter, providing related and valuable assurance, advice, and insight[3].
  • Only report to the executives what they need to know, when they need to know it, in a clear fashion that is easily consumed, understood, and acted upon. If necessary, let them know that additional information on matters of lesser importance is available and that those matters are being handled lower in their organization.
  • Don’t waste their time. They will resent you for it!

When leaders of the organization know that you only speak when it is important, they will listen. I am reminded of this brilliant EF Hutton commercial from the 1970s.

I welcome your thoughts. Are you or your team wasting people’s time? Do you get their full and complete attention when you speak or send a report, because they know what you have to say is important to them?

See this video, where I tell the story of a CRO who became irrelevant because he wasted the CEO’s time.

====================================================================

[1] When I was in IT, I had a team of about 50 people led by six or so direct reports. As CAE (and CRO), my teams ranged from 3 (all reporting to me) to 50 with 7 direct reports.

[2] In other words, risk-based auditing. See my video.

[3] I will continue to use this phrase even if it is not used in the IIA’s global standards. That is a serious oversight by the IIASB.

Excellent Insights into Cyber Risk

July 31, 2023 1 comment

The annual studies of cyber breaches and their impacts by the Ponemon Institute, sponsored by IBM, are very interesting, and I used them in my two related books.

The latest offering is Cost of a Data Breach Report 2023. It is based on reports by 553 organizations of various sizes across 16 countries and geographic regions and 17 industries.

I am sharing some important excerpts and points with my highlights. Later, I will provide some comments of my own.

  • “The average cost of a data breach reached an all-time high in 2023 of USD 4.45 million.”
  • “Only 51% said they plan to increase their investments in cyber security.”
  • “The top areas identified for additional investments included incident response (IR) planning and testing, employee training, and threat detection and response technologies.”
  • “Security AI and automation were shown to be important investments for reducing costs and minimizing time to identify and contain breaches. Organizations that used these capabilities extensively within their approach experienced, on average, a 108-day shorter time to identify and contain the breach. They also reported USD 1.76 million lower data breach costs compared to organizations that didn’t use security AI and automation capabilities.”
  • “Only one-third of companies discovered the data breach through their own security teams, highlighting a need for better threat detection.”
  • “While 63% of respondents said they involved law enforcement, the 37% that didn’t paid 9.6% more and experienced a 33-day longer breach lifecycle.”
  • “Organizations that reported low or no security system complexity experienced an average data breach cost of USD 3.84 million in 2023. Those with high levels of security system complexity reported an average cost of USD 5.28 million, representing an increase of 31.6%.”
  • The sector with the highest average cost of a breach was Healthcare ($10.93m), followed by Financial ($5.90m), Pharma ($4.82m), and Energy ($4.78m).
  • The mean time to identify a breach was 204 days.
  • The mean time to contain an identified breach was 73 days.
  • “Lost business costs dropped 8.5%, from USD 1.42 million in 2022 to USD 1.30 million in 2023. Lost business costs include activities such as business disruptions and revenue losses from system downtime, the cost of lost customers and acquiring new customers, and reputation losses and diminished goodwill.”
  • “Phishing and stolen or compromised credentials were responsible for 16% and 15% of breaches, respectively, with phishing moving into the lead spot by a small margin over stolen credentials, which was the most common vector in the 2022 report. Cloud misconfiguration was identified as the initial vector for 11% of attacks, followed by business email compromise at 9%.”
  • “This year, for the first time, the report examined both zero-day (unknown) vulnerabilities as well as known, unpatched vulnerabilities as the source of the data breach and found that more than 5% of the breaches studied originated from known vulnerabilities that had yet to be patched.”
  • “Although relatively rare at 6% of occurrences, attacks initiated by malicious insiders were the costliest, at an average of USD 4.90 million, which is 9.6% higher than the global average cost of USD 4.45 million per data breach. Phishing was the most prevalent attack vector and the second most expensive at USD 4.76 million. Breaches attributed to system error were the least costly, at an average of USD 3.96 million, and the least common, at 5% of occurrences.”
  • “This year, it took nearly 11 months (328 days) to identify and contain data breaches resulting from stolen or compromised credentials, on average, and about 10 months (308 days) to resolve breaches that were initiated by a malicious insider. Those two vectors, along with phishing and business email compromise, were also responsible for the costliest breaches. As a point of comparison, the overall mean time to identify and contain a data breach was 277 days or just over nine months.”
  • “Over one-quarter or 27% of breaches were disclosed by the attacker as part of a ransomware attack.”
  • “Respondents required a mean time of 320 days to identify and contain breaches disclosed by the attacker.”
  • “At USD 5.13 million, the average cost of a ransomware attack in the 2023 report increased 13% from the average cost of USD 4.54 million in the 2022 report. At USD 5.24 million, the average cost of a destructive attack in the 2023 report also increased 2.3% from the average cost of USD 5.12 million in the 2022 report. Total time to identify and contain a ransomware breach was 11.4% or 33 days shorter with law enforcement involvement, at 273 days in total compared to 306 days.”
  • “Organizations that paid the ransom during a ransomware attack achieved only a small difference in total cost, at USD 5.06 million compared to USD 5.17 million, a cost difference of USD 110,000 or 2.2%. However, this calculation doesn’t include the cost of the ransom itself.”
  • “Of the organizations studied, 31% incurred fines as a result of a data breach, and only 20% of those fines exceeded USD 250,000. A fine of USD 250,000 represented 5.6% of the average total cost of a data breach in the 2023 report.”
  • “Mega breaches, characterized by more than one million compromised records, are relatively rare. But they exert a powerful impact due to their outsized scope.” When the breach involved 1-10 million records, the average cost was $36m. The costs then jump considerably: 10-20m, $166m; 20-30m, $225m; 40-50m, $328m; and 50-60m, $332m.
  • “At only 18% of respondents, insurance protection was the least common investment after a breach.”
  • “Only 28% of organizations extensively used security AI and automation tools in their cybersecurity processes, while 33% had limited use. That leaves nearly 4 in 10 relying solely on manual inputs in their security operations.”
  • “Respondents from organizations that extensively used security AI and automation were able to identify and contain a breach in 214 days, which was 108 days shorter than those with no use. This means identifying and containing a breach with extensive use of security AI and automation took just 66% of the time it took organizations with no use. Limited use also made a significant impact, with an average time to identify and contain a breach in 234 days, which was 88 days shorter than organizations with no use. It’s clear that even a limited effort to integrate security AI and automation into security workflows can offer a significant acceleration in the time to identify and contain a breach as well as a sizable reduction in costs.”
  • “This year’s research showed that threat intelligence users uncovered breaches in 13.9% less time than those without a threat intelligence investment, a difference of 28 days.”
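As a quick sanity check on the report’s arithmetic, several of the percentages and differences quoted above can be reproduced from the dollar and day figures alone. A minimal Python sketch, using only numbers quoted from the report:

```python
def pct_change(new, old):
    """Percentage change from old to new."""
    return (new - old) / old * 100.0

# Ransomware: USD 5.13M in 2023 vs USD 4.54M in 2022 -> ~13% increase
ransomware_increase = pct_change(5.13, 4.54)

# Paying the ransom: USD 5.17M vs USD 5.06M -> a difference of ~USD 110,000
paid_difference_usd = (5.17 - 5.06) * 1_000_000

# Security AI/automation: 214 days to identify and contain, vs 214 + 108 = 322
# days without it -> extensive users took ~66% of the time
ai_time_ratio = 214 / (214 + 108)

print(round(ransomware_increase, 1))  # ~13.0
print(round(paid_difference_usd))     # ~110000
print(round(ai_time_ratio, 2))        # ~0.66
```

The quoted figures check out, which is reassuring when so many percentages are packed into one report.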

The authors made four recommendations. Please see the report for the details.

  1. Build security into every stage of software development and deployment—and test regularly.
  2. Modernize data protection across hybrid cloud.
  3. Use security AI and automation to increase speed and accuracy.
  4. Strengthen resiliency by knowing your attack surface and practicing IR (incident response).

My thoughts:

  1. Cyber risk is not nearly as great as most seem to believe. With an average impact (and it seems the numbers are fully inclusive of business disruption, fines, and more) of just $4.45 million, a breach is highly unlikely to cause an organization to fail to achieve its objectives for the year. With annual salaries for a software engineer in Silicon Valley at $175,000 (before benefits), that is not a large number.
  2. I continue to be very concerned that organizations don’t know when they have been breached! The mean time to find out is 204 days (with 73 more days to contain it), and the breach is only identified one third of the time by the organization’s own security team.
  3. Based on the above, it is hard to say that cybersecurity risk is one of the top sources of business risk for any organization. There are so many that can have a larger impact, including being late to market with new products. The “mega breaches” are rare and are at healthcare companies, most of which have revenues in the billions. Cyber may be important, but it would be unusual for it to be a top ten risk.
  4. New technologies offer hope that they may enable organizations to detect and therefore contain breaches much faster.
  5. While it is nearly impossible to prevent a breach, minimizing any harm is essential. That means a focus on detection and response.
  6. A solid business risk assessment is critical to determine how much to invest, and then where to invest. That means setting the ISO, NIST, FAIR, and other cyber risk frameworks aside (they only assess risk to information assets). Understand the risk to enterprise objectives first and assess the value of an investment in cyber vs. other risk and opportunity investments. In other words, assess cyber risk in context. Only when the cyber investment decision has been made should you use one of those frameworks to determine where within cyber to invest. See Understanding the Business Risk that is Cyber: A guide for both business executives and InfoSec managers to bridge the gap.
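To make the salary comparison concrete, a one-line illustration in Python. Both numbers come from the text above (the report’s average breach cost and the cited Silicon Valley engineer salary), not additional data:

```python
avg_breach_cost = 4_450_000  # average cost of a data breach, per the report (USD)
engineer_salary = 175_000    # annual software engineer salary cited above (USD)

# An average breach costs the equivalent of roughly 25 engineer salaries
salaries_per_breach = avg_breach_cost / engineer_salary
print(round(salaries_per_breach, 1))  # ~25.4
```

For a company employing hundreds or thousands of engineers, that helps explain why an average breach rarely threatens annual objectives on its own.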

I welcome your thoughts (and I hope you find my book valuable).

Have you seen my YouTube channel yet?

Is your culture a healthy one?

July 27, 2023 6 comments

Alexei Sidorenko has followed the trend and provided us with some help with “risk culture”.

I am not a fan of this latest fad.

I think we should focus on whether the culture of the organization is healthy, in that it promotes desired behaviors.

Last year, I wrote Is there an effective risk culture? It centered on the work of Horst Simon, who describes himself as a “Risk Culture Builder”. He cited these definitions of risk culture:

  • “Risk culture is the system of values and behaviours present in an organization that shapes risk decisions of management and employees. One element of risk culture is a common understanding of an organization and its business purpose” NC State ERM Initiative
  • “Risk culture is a term describing the values, beliefs, knowledge, attitudes and understanding about risk shared by a group of people with a common purpose” Institute of Risk Management[1]

The year before, I covered work by the Institute of Internal Auditors Australia. They have a different set of definitions that I like more.

Culture is a characteristic of a group of people – the shared perceptions about what behaviour is ‘correct’, prioritised and likely to be rewarded. Organisations pursue many different strategic priorities and operate in different political, economic and social contexts, so their cultures vary.

Individual behaviour is affected by the way in which actions are rewarded or punished. In the workplace, people learn what is acceptable behaviour by observing the behaviour (including speech) of peers and managers. Behaviour that is repeated regularly becomes the norm, or ‘the way we do things around here’. Behaviour of managers and leaders is particularly important in demonstrating the priorities of the organisation.

Risk culture is an aspect of broader organisational culture. Risk culture refers to the behavioural norms that help or hinder effective risk management. Some definitions of risk culture also incorporate the group’s underlying values and assumptions about risk management, and others incorporate policies and systems. In large organisations, subcultures often form in different areas and even in specific teams with different managers. Internal audit teams should not assume that risk culture is consistent throughout an organisation, or even within a large division or function or tier of management of that organisation. Culture normally forms in groups of people that have regular interaction with one another, often with a common manager.

Does it make sense to focus on one dimension of culture, that relating to risk? Or should we recognize that there are many dimensions and they may actually be in tension if not conflict with each other – such as compliance, risk, and entrepreneurship.

In a 2018 post, I quoted Deloitte:

Culture matters, because a strong, positive corporate culture provides a framework not only for risk mitigation, but also for both short- and long-term value creation. It aligns values, goals, behaviors, and systems throughout the organization in ways that can have favorable impacts, both internally (for example, through positive employee engagement or by facilitating optimal performance or a strong safety record) and externally (through positive branding, reputation and competitive advantage).

On the other hand, a damaged or broken culture can create dysfunction throughout the organization and create risk to critical assets, including brand reputation, intellectual property, and talent. As recent developments demonstrate, these and other negative impacts can destroy value and, ultimately, the organization itself. An important takeaway from the above is that a strong, positive culture is an important asset of any organization that should be supported and protected. It is not merely a “soft” issue of interest to investors and the media; rather, it can be critical to the company’s growth and performance.

I then asked these ten questions:

  1. How have you defined the culture you want the organization to have?
  2. Does it include all forms of desired (and less desired) behavior?
  3. How have you communicated this to everybody involved in the organization’s success?
  4. How have you ensured everybody understands?
  5. Are there repercussions for unacceptable behavior, even if there is no breach of law?
  6. How do you know whether behaviors across the organization reflect the desired culture?
  7. What is the level of noncompliance, how do you know, and is it acceptable? If not, what are you doing about it?
  8. How often is culture discussed, measured, and who is involved?
  9. Do our employees agree our stated culture is appropriate and is in place? How do you know?
  10. How can you keep us assured of an appropriate culture, especially as the environment changes, including the onboarding of new management and staff, completion of acquisitions, and so on?

When I talk about “defining the culture”, I am talking about the need for the culture to:

  • Encourage teamwork and the sharing of information.
  • Focus on customer satisfaction.
  • Be cognizant of the organization’s reputation and its standing within the community.
  • Be ethical and compliant, not only with applicable laws and regulations but also with what is right and what upholds the values of the organization.
  • Promote the whole organization rather than individual or team success.
  • Be entrepreneurial, creative, and imaginative, rather than stagnant.
  • Respect and develop every employee.
  • Focus on longer-term success and growth rather than only the short term.
  • Be willing and able to take the right risks.
  • Act and make decisions at the appropriate speed.
  • Accept and not punish failures.
  • Focus on quality.
  • Have a strong work ethic.

Rather than asking about whether you have the risk culture you want, I prefer to talk about whether the organization’s culture promotes the healthy behaviors you need for continued success.

I also prefer to talk about how people make decisions rather than “risk culture”. When decisions are made:

  • Are they made by the right people?
  • Do they gather and use appropriate and reliable, timely and current information? Do they consult with others, especially others that may be affected by the decision?
  • Are they made at the right speed (right time) given the gravity of the decision, the need to obtain relevant information, etc.?
  • Are they adversely affected by individual or team bias?

Today’s post is complemented by a video where I discuss a number of healthy and unhealthy corporate cultures I have experienced. It helps me explain why I am more sensitive to other indicators of an unhealthy culture.

While risk culture is interesting, and there is some measure of value in understanding how risk attitudes are shaped, there are other dimensions of culture that are usually more important, making the difference between success and failure.

I welcome your comments.

What should practitioners do about AI?

July 24, 2023 1 comment

I am not an expert on AI, although I have toyed with Google’s Bard, Microsoft’s Bing, and OpenAI’s ChatGPT. But clearly it is seen as the best thing since sliced bread, so the question is what it should mean for risk and audit practitioners.


I started by asking Bard “what is artificial intelligence?” and it replied:

Artificial intelligence (AI) is the ability of a computer or a robot controlled by a computer to do tasks that are usually done by humans because they require human intelligence and discernment. AI research has been highly successful in developing effective techniques for solving a wide range of problems, from game playing to medical diagnosis.

There are many different types of AI, but some of the most common include:

  • Machine learning: This is a type of AI that allows computers to learn from data without being explicitly programmed. Machine learning algorithms are used in a wide variety of applications, such as spam filtering, fraud detection, and image recognition.
  • Natural language processing: This is a type of AI that allows computers to understand and process human language. Natural language processing algorithms are used in a wide variety of applications, such as machine translation, chatbots, and speech recognition.
  • Computer vision: This is a type of AI that allows computers to see and understand the world around them. Computer vision algorithms are used in a wide variety of applications, such as self-driving cars, facial recognition, and medical image analysis.

AI is a rapidly growing field, and there is a lot of excitement about the potential benefits of AI. However, there are also some concerns about the potential risks of AI, such as the possibility of job displacement and the misuse of AI for malicious purposes.

Overall, AI is a powerful tool that has the potential to improve our lives in many ways. However, it is important to be aware of the potential risks of AI and to use it responsibly.

Here are some examples of how AI is being used today:

  • Virtual assistants: Virtual assistants like Amazon Alexa and Apple’s Siri use AI to understand your voice commands and respond to your requests.
  • Self-driving cars: Self-driving cars use AI to navigate the road and avoid obstacles.
  • Fraud detection: AI is used to detect fraudulent transactions in financial services.
  • Medical diagnosis: AI is used to diagnose diseases and recommend treatments.

These are just a few examples of the many ways that AI is being used today. As AI continues to develop, we can expect to see even more innovative and beneficial applications of this technology.

ChatGPT agrees:

AI stands for “Artificial Intelligence.” It refers to the development of computer systems that can perform tasks that typically require human intelligence, such as learning, reasoning, problem-solving, perception, and language understanding. AI technologies use algorithms and large amounts of data to mimic human-like cognitive abilities, enabling machines to perform complex tasks and make decisions without explicit programming for each step.

Some are now talking about “generative AI”, so I asked Bing what that is:

Generative AI is a category of artificial intelligence algorithms that generate new outputs based on the data they have been trained on. It can be used to create new text, images, video, audio, code or synthetic data. Unlike traditional AI systems that are designed to recognize patterns and make predictions, generative AI creates new content in the form of images, text, audio, and more. Generative AI is also known as generative artificial intelligence or GenAI.

Generative AI is based on machine learning processes inspired by the inner workings of the human brain, known as neural networks. Training the model involves feeding algorithms large amounts of data, which serves as the foundation for the AI model to learn from. This can consist of text, code, graphics or any other type of content relevant to the task at hand. Once the training data has been collected, the AI model analyzes the patterns and relationships within the data to understand the underlying rules governing the content. The AI model continuously fine-tunes its parameters as it learns, improving its ability to simulate human-generated content.

OK, I think I sort of understand all of that.

But what should practitioners do about it?

There are two aspects to the question.


The first aspect is to use it ourselves.

Hassam Raza believes that:

AI has the potential to revolutionize the internal audit process. By leveraging AI-powered tools, internal auditors can analyze data more effectively, identify potential risks and opportunities, and improve overall audit efficiency. As organizations continue to adopt AI, it is crucial for internal auditors to develop the skills and knowledge necessary to effectively integrate AI into their audit processes.

Alex Rusate agrees:

Artificial intelligence (AI) has the potential to transform the internal audit profession. ChatGPT, a large language model trained by OpenAI and based on the GPT-3.5 architecture, is an AI tool that can help internal auditors in various phases of their audit work.

They both talk about the possibilities of examining large volumes of data in risk assessment, planning, and in detecting patterns that could indicate fraud. Alex believes that ChatGPT can be used to help write an audit report – which I question.

This all sounds like advanced analytics, which is not a new development. We have been using and talking about it for years. At Business Objects, the marketing department talked about “shining a light” on the business using its analytics tools (then referred to as “business intelligence” software).

I am all for using analytics in auditing and in the risk office. I used analytics of the manual kind when I was leading the audit of Hercules Powder Co. (UK) as a senior auditor with Coopers & Lybrand decades ago; I wrote software to identify trends in enterprise software errors (“abends”) when auditing IBM Long Lines’ Atlanta data center’s general controls; and my team used Business Objects’ own analytics software to detect potential revenue recognition fraud at that company.
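As a hypothetical illustration of this kind of analytics (a sketch of the general idea, not any of the tools named above), a few lines of Python can flag transaction amounts that deviate sharply from the norm, which is the basic notion behind many detective controls:

```python
from statistics import mean, pstdev

def flag_outliers(amounts, threshold=3.0):
    """Return amounts more than `threshold` standard deviations from the mean.

    A toy detective control: real audit analytics would segment by account,
    period, and counterparty before flagging anything.
    """
    mu = mean(amounts)
    sigma = pstdev(amounts)
    if sigma == 0:  # no variation, nothing to flag
        return []
    return [a for a in amounts if abs(a - mu) / sigma > threshold]

# Fifty routine entries and one unusually large one
entries = [100.0] * 50 + [10_000.0]
print(flag_outliers(entries))  # [10000.0]
```

AI-powered tooling goes far beyond this, but the pattern is the same: management (not the auditor) should own these checks as ongoing controls.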

So I agree with the sentiment in an article, AI and Internal Audit in the ACCA’s journal, that “Data analytics can enable internal auditors to deliver faster insights into fast-moving risks”. The article tells us:

The Chartered Institute of Internal Auditors (IIA) recently published a report urging internal auditors to fully embrace data analytics, particularly in an age of ‘systemic risk’. The aftermath of the pandemic, the war in Ukraine and now a recession have all magnified and exacerbated a multitude of business-critical risks. According to the IIA, these major risk events are having compounding downstream effects on supply chains, inflation, growth, costs, forex rates, cybersecurity and workplace mental health.

The report, Embracing data analytics: Ensuring internal audit’s relevance in a data-led world, argues that harnessing and embracing the power of data analytics can enable internal auditors to deliver faster and more incisive insights into fast-moving risks so that their boards can then act swiftly.

‘The internal auditor of the future will be fully data-enabled,’ concludes the report. ‘AI and machine learning are being used to detect risks and automate process outcomes testing, strengthening the third line. But computers cannot give a nuanced control design opinion. Internal audit will always require a human touch.’

I have serious reservations about internal auditors using AI to detect errors or other activities of concern. We are not the external auditors. Instead, we should be helping management use these tools as detective controls.

There are going to be cases where our independent use of technology to detect fraud, for example, is justified. But as a general rule, auditors should focus on ensuring that management has the appropriate controls to run the business, rather than acting for management as a detective control.


In much the same way, AI is being touted as a magic potion for risk functions. Clayton Scott tells us:

Risk management is a field that has always relied on making informed decisions based on data and experience. With the advent of artificial intelligence (AI), risk management is undergoing a transformation that will change how decisions are made. AI has the potential to revolutionise the way risk managers analyse data, identify trends, and predict outcomes.

With the help of AI, risk managers can now access vast amounts of data to make informed decisions and accurately predict future risks. AI-powered systems can analyse data and identify patterns that would be challenging for humans to recognise. This means that risk managers can take proactive steps to mitigate risk before it becomes a problem

Additionally, AI can help streamline the risk management process by automating repetitive tasks and reducing the risk of human error. With the power of AI, risk managers can make more informed decisions and create a safer, more secure future. In this blog post, we will discuss how AI is transforming the field of risk management and what the future holds for this exciting technology.


However, we must be cautious.

Steven Watson warns auditors:

The increasing use of AI in audit comes with several potential risks that need to be considered.

Firstly, there is the risk of error and bias in the data being analysed. AI algorithms rely heavily on data inputs, and if the data is incomplete, inaccurate, or biased, the results produced by the AI could also be inaccurate or biased. This could lead to incorrect conclusions being drawn from the audit, potentially damaging the reputation of the auditor, and harming the client’s business.

Additionally, the use of AI in audit could lead to a reduced level of human oversight and intervention, which could increase the risk of fraudulent activities going unnoticed. If the AI system is not properly designed or implemented, it could miss important red flags or warning signs that a human auditor might have noticed.

Another significant risk associated with the use of AI in audit is cybersecurity. AI systems which are connected to a network or the internet are vulnerable to hacking and cyber-attacks. If an attacker gains access to the AI system, they could manipulate the data inputs or alter the algorithms to produce misleading results. This could cause significant damage to the audit process and could also result in the unauthorised disclosure of sensitive client information.

As such, it is important that auditors take adequate measures to secure their AI systems and ensure that they are not vulnerable to cyber threats. This could include implementing robust authentication procedures, using encryption to protect data, and regularly updating and patching AI software to mitigate known vulnerabilities.

Already, there are concerns about security of personal data when using these tools. For example, the Italian privacy regulator has recently ordered an immediate blocking of access to ChatGPT whilst it investigates the software platform over a range of risks for privacy and cybersecurity. These issues will become more critical over time.


If AI is just about a more advanced form of analytics, why all the hoopla?

The hoopla is deserved, because the latest versions of AI are far more than analytics. Going back to Bing:

Generative AI is a category of artificial intelligence algorithms that generate new outputs based on the data they have been trained on. It can be used to create new text, images, video, audio, code or synthetic data. Unlike traditional AI systems that are designed to recognize patterns and make predictions, generative AI creates new content in the form of images, text, audio, and more.

I have another worry.

I asked Bard to say something about me. The first time, it got my background correct. I asked a second time, and it created different content, including a consulting company that is pure fiction.

If we can’t be sure about the integrity of AI results used in decision-making, in medical diagnosis, or in legal briefs (one filed with a NY federal judge included fictitious case citations), we should be very worried indeed.

====================================================================

Using AI is one aspect. The other is what we should do when our organization uses AI.

The people at McKinsey have shared two great pieces:

They tell us:

  • With proper guardrails in place, generative AI can not only unlock novel use cases for businesses but also speed up, scale, or otherwise improve existing ones. Imagine a customer sales call, for example. A specially trained AI model could suggest upselling opportunities to a salesperson, but until now those were usually based only on static customer data obtained before the start of the call, such as demographics and purchasing patterns. A generative AI tool might suggest upselling opportunities to the salesperson in real time based on the actual content of the conversation, drawing from internal customer data, external market trends, and social media influencer data. At the same time, generative AI could offer a first draft of a sales pitch for the salesperson to adapt and personalize.
  • Companies will also have to assess whether they have the necessary technical expertise, technology and data architecture, operating model, and risk management processes that some of the more transformative implementations of generative AI will require.
  • Generative AI poses a variety of risks. CEOs will want to design their teams and processes to mitigate those risks from the start—not only to meet fast-evolving regulatory requirements but also to protect their business and earn consumers’ digital trust.
  • CEOs should consider exploration of generative AI a must, not a maybe. Generative AI can create value in a wide range of use cases. The economics and technical requirements to start are not prohibitive, while the downside of inaction could be quickly falling behind competitors. Each CEO should work with the executive team to reflect on where and how to play. Some CEOs may decide that generative AI presents a transformative opportunity for their companies, offering a chance to reimagine everything from research and development to marketing and sales to customer operations. Others may choose to start small and scale later. Once the decision is made, there are technical pathways that AI experts can follow to execute the strategy, depending on the use case.
  • Generative AI is developing fast, and companies will have to balance pace and innovation with caution. The board’s role is to constructively challenge the management team to ensure this happens, keeping the organization at the forefront of this latest technological development yet intensely mindful of the risks.

I strongly recommend reading both pieces.

In the same way that we provide consulting advice and assurance on major systems and other implementations, practitioners need to get engaged right at the start when their organization is considering the use of AI.

====================================================================

Pulling this together:

  1. Understand what your organization is planning for AI.
2. If it is not planning any use of AI, find out why not. Is it at risk of lagging the market?
  3. Get involved as consultants to the project leaders, helping them ensure that risks and opportunities are addressed as part of a successful project. For example, can AI be used to improve controls?
  4. Consider the points made by McKinsey about a coordinated approach to the use of AI. If they have a committee set up for that purpose, get on it or at least attend their meetings.
  5. Ensure the board has the information and assurance it needs about the use of AI, its risks and its opportunities.
  6. Help management monitor the use of AI to identify potential problems and errors.
  7. Ensure management has a plan and related processes to respond to any AI-related problems.
  8. Consider the use of AI in your function, but don’t fall in love with it at the expense of using human intelligence. Auditors, for example, really don’t need to monitor and test 100% of transactions; that’s a management responsibility (as a detective control).

I welcome your thoughts.

====================================================================

Video Channel: have you seen my latest videos on fraud, critical thinking, follow-up audits, reporting, and SOX?

Everybody is still talking about GRC

July 21, 2023 8 comments

Anthony Pugliese (President and CEO of the IIA) said in a recent webinar that GRC stands for Governance, Risk Management, and Compliance-slash-Control.

I am sure he said that because the IIA has the C meaning Control while the rest of the universe has C for Compliance.

This is just one of the reasons that I say that GRC actually stands for Governance, Risk Management, and Confusion.

Initially, I coined the phrase because every software solution provider that touts a GRC package has different functionalities. Add to that the issue of whether it’s about Control or Compliance.

I also heard about a caller to the SAP help desk who asked about the company’s “GRC products”. The SAP employee asked which ones the caller was interested in, as SAP at that time had an SAP GRC solutions line that included Risk Management, Access Control (often incorrectly referred to as “SAP GRC”), Trade Compliance, and Process Control. It also had a Strategy Management solution that was not included in the GRC line (even though it is fundamental to GRC capabilities, as explained momentarily).

The caller replied, “You know, GRC.”

The employee didn’t know what he wanted, or what GRC meant in practice.

Most people don’t know what GRC means. I don’t mean what the acronym stands for; that is fairly clear. But what does the combination of the three (or so) things mean? Why combine them?

The “most people” include:

  • The people at Tripwire. In their article 5 Things Everyone Needs to Know About GRC, they say that the most common GRC frameworks are:
    • ISO/IEC 27005:2022
    • NIST Risk Management Framework
    • NCSC Risk Management Guidance
    • EU IT Security Risk Management Methodology
    • NIST Cybersecurity Framework
    • NCSC Cyber Assessment Framework Guidance
    • BSI Standard 200-2
    • NIS 2

But these are all cybersecurity and not “GRC” frameworks.

  • Those who call themselves “GRC specialists” even though their jobs are in a single function such as risk management, information security, or internal audit – one role rather than responsibility for all of GRC.
  • The companies that tout their solutions as GRC even though they have missing functionalities like support for the board’s governance and oversight activities, the legal function, cybersecurity, strategy and performance management, and so on.

There is an answer to the question of what GRC means as a combination of functions.

With the help of Michael Rasmussen[1] and others, the Open Compliance and Ethics Group (OCEG[2]) gave us an actionable definition of GRC. You can find it in their GRC Capability Model (they just released version 3.5). I prefer the earlier version:

GRC is the capability, or integrated collection of capabilities, that enables an organization to reliably achieve objectives, address uncertainty, and act with integrity; including the governance, assurance and management of performance, risk, and compliance.

See what I wrote about this last year in Upgrade to Effective GRC.

If you are to reliably achieve your objectives:

  • The objectives must be clearly stated and shared with everybody. You don’t want people marching to different drums towards different goals. Enterprise, department, team, and individual objectives and targets must be in sync.
  • Enterprise, department, team, and individual performance assessment and compensation must be in line.
  • Everybody needs to be working together, sharing information and resources to achieve the shared goal.
  • Risks need to be identified, assessed, evaluated, and taken with regard to how they affect the likelihood of achieving objectives.
  • Everybody needs to operate with integrity at all times, with respect for each other, their customers, suppliers, the community, and others.
  • Everybody needs to comply not only with applicable laws and regulations, but with enterprise policies.
  • Silos and fragmented operations (such as the separation of risk assessment and management from strategy and performance management, or the assessment of different sources of risk in different languages and measures) need to be minimized so decision-makers can see the big picture.
  • Technology has to work for everybody, providing a single source of truth.

There is immense value when GRC is effective, and assessing it is an activity every executive, board member, and auditor should consider.

This week, I came across a marketing piece from software vendor Corporater. I have no relationship with them, have not seen their product, have no idea how good it is, and do not endorse their or any other product (I remain independent, although I work from time to time with vendors). While they have unnecessarily added P for performance to the acronym[3], what they say about GPRC makes sense:

Any corporate GRC program must be aligned with business objectives and measured using corporate standards. Failing to do so will most likely render the GRC program ineffective. Vice-versa, any corporate strategy or performance program supported by GRC capabilities will perform better over time. It is critical to enable the identification and monitoring of risks that affect the performance of strategy and business objectives.

While it is refreshing to see the integration of strategy, performance, and risk management, this product (like every other that I can see or imagine) does not cover every GRC functionality. Frankly, that would be impossible.

As I said in my earlier post, GRC involves almost every aspect of the organization, including:

  • Strategy management
  • Performance management
  • Board operations, including the board package
  • Legal, including case management
  • Risk management, including opportunities
  • Compliance (and there are a great many compliance requirements, from tax to human capital to customs to environmental and more)
  • Internal audit
  • Treasury
  • Finance
  • Marketing
  • Sales
  • Product development
  • Engineering
  • Operations
  • Quality management
  • Safety
  • Loss investigations and corporate security
  • And much more

Let me see if I can pierce the fog of GRC confusion.

My advice.

  1. If you want to talk about GRC, use the OCEG language and meaning.
  2. If you want to optimize GRC as so defined, assess it first. Consider the questions in How Good is your GRC?: Twelve Questions to Guide Executives, Boards, and Practitioners.
  3. If you want to talk about your responsibilities within the organization for risk management, compliance, information security, or something else, talk about those responsibilities and not GRC.
  4. If you are thinking about acquiring “GRC” software, make sure everything flows from strategy and objectives. Get the solutions that meet your specific needs rather than what is touted as the best “GRC” product. Recognize that each has a different set of functionalities. Don’t think you must have a single, integrated platform; identify where information needs to be shared between functionalities and the value of that capability; contrast that with the value and cost of multiple solutions that are not tightly integrated on a single platform.
  5. Understand what is holding your organization back from reliably achieving objectives, considering what might happen (risk), and acting with integrity.

I welcome your thoughts.

====================================================================

[1] Michael calls himself the GRC Pundit and the Father of GRC, and his work is worth following on his website and social media.

[2] Michael, Brian Barnier, and I were the first three OCEG Fellows. While that is not a paid honor, I cherish it. It was awarded to me for my GRC thought leadership.

[3] I recognize that it makes them look different and has value from a Marketing perspective.

What is the state of risk management?

July 17, 2023 7 comments

Professors Mark Beasley and Bruce Branson of the North Carolina State University’s Poole College of Management have shared with us for 14 years the results of their surveys of risk management practices.

Their latest report, 2023 the State of Risk Oversight: An Overview of Enterprise Risk Management Practices, is based on the responses of 454 people. 21% are CFOs, 12% are CROs, 10% lead the internal audit function, and 7% have the title of Controller. 25% are from small organizations with annual revenues of $10 million or less. 130 (32%) are with companies of $1 billion or more in revenue, and 121 (30%) are at not-for-profit organizations.

While I would like to think otherwise, I don’t expect small companies to make significant investments in specialist risk management staff or systems. Instead, they rely on operating management to understand risks and opportunities facing the organization and make related informed and intelligent decisions.

Before reviewing the results of the survey, it is useful to consider the questions that Mark and Bruce have posed to the respondents. (They are in Appendix B.)

  • Every question is focused on potential events or situations that might have a negative effect on the business. None are about understanding and considering positive events or situations – opportunities.
  • There are no questions about whether risk management provides the information necessary for informed and intelligent business decisions, taking the right level of the right risks for success every day.
  • There are no questions about seeing the big picture, where all the risks and opportunities are identified and considered together.
  • There are no questions about how risks and opportunities might affect the achievement of objectives, even though risk is defined (explicitly by ISO and implicitly by COSO) as the effect on the achievement of objectives.
  • While there are suggested questions (although not in the survey) about the consideration of risk in strategic decision-making, the need for it in tactical decision-making is absent. Risk is taken every day, not monthly, quarterly, or (heaven forbid) annually.

So the bar they have set for assessing the maturity of risk management practices is very low.

I have asked them what they consider “mature” risk management, and have suggested talking about it, but they have not responded to my outreach. I have also searched their online library for related articles and only found one – which I was pleased to see referenced a piece I wrote for the IIA’s magazine a dozen years ago! You can find my risk management maturity model in Risk Management for Success.

Even this low bar is not being achieved according to the study:

  • While two-thirds of respondents describe the volume and complexity of risks as higher than prior levels, less than one-third describe their risk management processes as mature or robust. That suggests a disconnect between risk management capabilities and needs.
    • Just 10% of the largest organizations described their risk management as “robust”, and only 7% of those in financial services did so.
    • 2% described their risk management as providing a competitive advantage.
  • Many leaders believe risk management is a distraction. The “tone at the top” may not be sufficiently embracing the value and relevance of risk management in the context of the organization’s strategic success. Many tend to view risk management as bureaucratic and non-value adding.
  • While executives appreciate the reality that risk and return are interconnected, most respondents do not view their organization’s risk management efforts as providing strategic insight.
  • Most organizations report risks to the board on an annual rather than a quarterly or more frequent basis, despite the ever-changing nature of the global risk environment. Only one-quarter of respondents believe risk information generated by the organization’s ERM process is formally discussed by the full board of directors when it discusses the strategic plan. Rich insights about the interconnected nature of risks and their impact on the strategy of the organization should be a primary and regular input to overall board discussions and governance.

What does this mean?

Even at this low bar, where only adverse risks are managed rather than the achievement of objectives in light of all sources of risk (and opportunity), executives are not seeing the value of risk management – as they see it practiced – to them and to the organization. They are not investing, and risk management is immature.

My guess is that those who have more robust systems and processes are being forced to invest in risk management by their regulators. There will be exceptions, of course, but there are clearly few of them.

I call again for academics (including Bruce and Mark), consultants, and leading practitioners to recognize that traditional risk management (focusing on avoiding failure) is not only the path to that very failure but will never be recognized by executives as contributing to success.

For example, the recent publication of the Chartered Professional Accountants in Canada, Risk oversight: A framework for identifying, understanding and addressing risk, also discusses only the oversight of adverse events and situations.

If leaders don’t see the value of risk management in helping them make the right decisions for success, they won’t invest in it. I repeat this from the NC State report:

Many leaders believe risk management is a distraction. The “tone at the top” may not be sufficiently embracing the value and relevance of risk management in the context of the organization’s strategic success. Many tend to view risk management as bureaucratic and non-value adding.

Surveys like this and guidance from others should be changed to assess and promote true risk management maturity, not failure management. We need to work together to change the expectations for effective risk management.

I welcome your thoughts.

By the way, I now have a YouTube channel with a bunch of videos that you may enjoy.

Let my people think!

July 13, 2023 2 comments

I congratulate Richard Chambers for his recent article, Is a Crucial Internal Audit Skill in Decline?

The need for practitioners of any stripe to think has been a topic of mine for years. This week I posted a video on my YouTube channel, “Thinking is a Critical Skill”.

Richard reports:

For more than a decade, chief audit executives have consistently rated critical thinking as one of the top three skills sought when recruiting internal audit staff. In fact, critical thinking often ranks higher than accounting, risk management or even technology skills.

CAEs see critical thinking as the prized attribute that can separate good internal auditors from great ones.

He shares seven recommendations from Dr. Tara Well, associate professor of psychology at Barnard College of Columbia University, which I recommend reading and giving thoughtful[1] consideration.

Here are my own recommendations, echoing some of what I say in the video:

For yourself:

  • Take the time to think. I have a picture in my head of Michael Brooding of my Tosco team with his head down on his desk, appearing to sleep but in reality thinking about what he has just been told and has seen.
  • Use travel time to think.
  • Don’t automatically do what you did last time, even (especially) if it was successful. Think about it first. Challenge yourself.
  • Use your imagination.
  • Make sure you understand why things are done the way they are being done. Are they being done just because somebody told them to (even the auditors)? Think about it. Are the right things being done the right way?
  • Don’t do something just because it’s in the audit program (get rid of standard audit programs or checklists as they constrict the mind), it’s in a book, it’s deemed “best practice”, or because you have been told to do it. It may be wrong (out-of-date, for example), you may not understand the instruction, there may be a better way, and you will not perform with excellence if you don’t understand why it needs to be done. Ask questions and think about the answers.
  • Get out into the business. Look and Listen. Talk to the people without “auditing them” and listen carefully to what they tell you. Think about it and gain a real-life understanding of the business and how it’s run. Understand the business and what needs to be done before trying to audit it.
  • Read industry periodicals and think about how what you are reading applies to your organization. Discuss interesting ideas with management and your peers. Think about it.
  • Don’t automatically rate findings or assessments using a rating system. Instead, think about how what you have seen might affect the success of the organization and the achievement of its goals and objectives.
  • Think about what you would do differently if you were in charge (of internal audit or of the function being audited).
  • Network, listen, and learn what others are doing. Think about it and whether those practices should be adopted (perhaps after modification) by you or your organization.
  • When you see an issue, keep an open mind. Talk to management, listen carefully and actively, then think about it and make up your own mind.
  • Take classes on critical thinking, and practice exercises that stimulate the mind. Watch TED Talks and think about what is said. Play chess, bridge, Sudoku, etc.

For your team:

  • Give your people the right to think, the time to think.
  • Encourage them to challenge you and your thinking.
  • Listen to them with respect and think about what they are saying.
  • Ask questions even if you know the “correct” answer. You may be pleasantly surprised.
  • If they have a question, don’t give them the answer. Help them find it themselves.
  • Assign tasks that will stretch their abilities, making them think about how to do them and why.
  • Tell people what the goal is and encourage them to suggest the best way to accomplish it.
  • Don’t give them an audit program or checklist. Have them suggest how the work should be done.
  • When there is a problem, ask them what they would do if they were responsible for the function.
  • Don’t accept “I don’t know”.
  • Give your people time to get out and understand the business. Encourage them to participate in company events and projects.
  • Trust your people until they prove you cannot – and then find new positions for them.
  • Give them the training they need.

As I say in the video, while it may be fascinating to talk about the new technologies like AI (Richard uses ChatGPT, and I have used Bard and others), the old skills like Thinking are far more important.

We are given three wonderful auditing tools:

  • Our eyes
  • Our ears
  • And the space in between

Use them and develop them.

Please think about this. I welcome your comments and ideas.

PS – if you like this blog or the YouTube channel, please subscribe.

[1] Pun intended

Is your risk appetite statement of value?

July 3, 2023 5 comments

A few days ago, Lose Luiz Valentim commented on my post, Auditors need to understand risk management. He pointed out that banks and other financial institutions are required by their regulators to have a risk appetite statement, and asked for my views on assessing their effectiveness.

I replied with a quick suggestion, but this is a topic that merits more.

Much more (this is a long post) because this is a very important topic – for risk practitioners, executive management, board members, and auditors.

I will cover some background on risk appetite before talking about how to determine whether your statements add value in practice rather than simply checking the regulator’s box.

====================================================================

My most read post ever is on the topic of risk appetite. More than 89,000 have viewed Just what is risk appetite and how does it differ from risk tolerance? In it, I said (I will highlight throughout this post excerpts I find especially useful):

How can we have a productive conversation about risk management unless we use the same language? One of the terms that serves as much to confuse as clarify is “risk appetite”. What does it mean, and how does it differ from risk tolerance?

Let’s look first at the COSO ERM Framework. It defines risk appetite as “the amount of risk, on a broad level, an organization is willing to accept in pursuit of stakeholder value.” In their Strengthening Enterprise Risk Management for Strategic Advantage, COSO says:

“An entity should also consider its risk tolerances, which are levels of variation the entity is willing to accept around specific objectives. Frequently, the terms risk appetite and risk tolerance are used interchangeably, although they represent related, but different concepts. Risk appetite is a broad-based description of the desired level of risk that an entity will take in pursuit of its mission. Risk tolerance reflects the acceptable variation in outcomes related to specific performance measures linked to objectives the entity seeks to achieve.”

They continue:

“So to determine risk tolerances, an entity needs to look at outcome measures of its key objectives, such as revenue growth, market share, customer satisfaction, or earnings per share, and consider what range of outcomes above and below the target would be acceptable. For example, an entity that has set a target of a customer satisfaction rating of 90% may tolerate a range of outcomes between 88% and 95%. This entity would not have an appetite for risks that could put its performance levels below 88%.”

Does this work? To a degree, perhaps. The way I look at it, risk appetite and risk tolerance are devices I use to determine whether the risk level is acceptable. I want to make sure I take enough risk, as well as ensure I am not taking too much. This is all within the context of achieving the organization’s objectives.
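The customer-satisfaction example in the COSO excerpt can be expressed as a simple range check. This is only a sketch, using the 90% target and 88–95% range from the quoted text; in practice, tolerances attach to each objective’s own performance measures:

```python
def within_tolerance(outcome: float, lower: float, upper: float) -> bool:
    """True if a performance outcome falls inside the acceptable range."""
    return lower <= outcome <= upper

# COSO's example: target 90% customer satisfaction, tolerated range 88%-95%
LOWER, UPPER = 88.0, 95.0

assert within_tolerance(90.0, LOWER, UPPER)      # on target
assert within_tolerance(94.0, LOWER, UPPER)      # above target but tolerable
assert not within_tolerance(87.5, LOWER, UPPER)  # below appetite: escalate
```

Note that the check is two-sided: an outcome can fall outside tolerance by being too low, but the same device can flag when not enough risk is being taken to reach the upper end of the range.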

In 2020, COSO gave us Risk Appetite – Critical to Success; Using Risk Appetite to Thrive in a Changing World.

The authors, Frank Martens and Larry Rittenberg, tell us:

At its core, risk appetite is critical to organizational success. Articulating risk appetite for your organization will provide board members and senior management with important insight.

The COSO Enterprise Risk Management—Integrating with Strategy and Performance defines risk appetite as:

The types and amount of risk, on a broad level, an organization is willing to accept in pursuit of value. Inherent in this definition are several key points.

Risk appetite:

  • Is intentionally broad to apply across an organization, recognizing that it may differ within various parts of the organization while remaining relevant in changing business conditions.
  • Focuses on risk that needs to be taken to pursue strategies that enhance long-term success.
  • Recognizes that risk is more than individual decisions.
  • Links to value—it is tied to the choices the organization makes on how it creates and preserves value.

Risk appetite is at the heart of decision-making. It is equally important in determining that a decision is necessary.

Every organization must accept that taking risks to innovate and grow is inherent to business. To not do so leaves the organization vulnerable to losing ground to other competitive organizations. The challenge is to know the right amount of risk necessary to sustain innovation and growth across the organization. With that knowledge, an organization can determine which strategies to adopt and which objectives to pursue.

My good friend, Alexei Sidorenko has shared a free booklet on risk appetite. I will come back to that later.

====================================================================

Now let’s examine some risk appetite statements from leading organizations.

The United States Agency for International Development said in 2018:

We have a MEDIUM risk appetite with regard to: Implementing long-term strategic focus in our country programs. We will set priorities and implement long-term strategic focus in our country programs based on rigorous analysis and collaboration with key stakeholders to achieve more effective results. We will also continually balance this with our obligation to implement initiatives, directives and/or priorities from Congress and the interagency not foreseen during the strategy development process.

This is from the Office of the Comptroller of the Currency (OCC)

The OCC has no appetite for unauthorized access to systems and confidential data, and will maintain strong controls to mitigate external threats against its technology infrastructure. The OCC has a low appetite for losing continuity of business operations stemming from unreliable telecommunications or system availability. Business resiliency planning and execution must be aligned with strategic objectives. The OCC has a moderate appetite for innovative technology solutions to meet user demands in a rapidly changing environment. The agency will exercise appropriate governance and discipline when considering and adopting new technology.

Network Rail has defined its risk appetite statements as follows:

Network Rail has no appetite for safety risk exposure that could result in injury or loss of life to public, passengers and workforce. Safety drives all major decisions in the organisation. All safety targets are met and improved year-on-year.

In the pursuit of its objectives, Network Rail is willing to accept, in some circumstances, risks that may result in some financial loss or exposure including a small chance of breach of the loan limit. It will not pursue additional income generating or cost saving initiatives unless returns are probable.

The company will only tolerate low-to-moderate gross exposure to delivery of operational performance targets including network reliability and capacity and asset condition, disaster recovery and succession planning, breakdown in information systems or information integrity.

The company wants to be seen as best in class and respected across industry. It will not accept any negative impact on reputation with any of its key stakeholders, and will only tolerate minimum exposure ie, minor negative media coverage, no impact on employees, and no political impacts.

Are any of these useful standards or guidance that help an organization take the right risks, at the right level, to achieve its objectives? Remember that risk is taken or modified by every business decision, so to be effective a risk appetite statement should guide each of those decisions.

Do the words “moderate”, “low”, “no”, “minimum”, etc. mean anything? How do they influence decision-making or the setting of strategies?

====================================================================

Time for some more guidance.

The Financial Reporting Council (FRC) report, Boards and Risk: A summary of discussions with companies, investors and advisers published by the FRC in September 2011, tells us:

There were differing views about whether it was either necessary or possible for the board to apply a single, aggregate risk appetite for the company as a whole, as opposed to having a clear view on its appetite or tolerance for individual risks. Many participants felt this was difficult, not least because of the difficulty of quantifying many of these risks and the company’s limited ability to mitigate a number of them, including external risks. A view was expressed that it was even more difficult for non-financial companies than for financial companies, particularly companies or groups operating across different sectors and markets, given the diverse nature of the risks they were dealing with. It was also noted that risk appetite can vary over time.

Some participants felt that all that could realistically be expected of the board was to have a clear understanding of the company’s overall exposure to risk, and how this might change as a result of changes in the strategy and operating environment. When developing the strategy, however, it was important for boards to agree their appetite or tolerance for individual key risks. At its simplest, it was suggested this could be done by articulating what types of risk were acceptable and what were not.

Where boards had set their risk appetite or tolerance for individual risks, some companies also compared the net and gross risks to the ‘target risk’, so that the Board could judge how close the company’s current exposure was to that which it considered acceptable.

The importance of ensuring that incentives were aligned with company strategy and risk appetite or tolerance to promote an appropriate culture was widely recognised. There were different views on the extent to which companies had succeeded in achieving this alignment.

Participants from companies said that in their experience most investors rarely asked questions about risk or internal control. There was a general wariness about disclosing commercially sensitive information or information that, if disclosed, might bring about the very risks the company was seeking to avoid. Reporting on the company risk appetite was felt to be difficult as risk appetite was not constant but varied over time and depending on market conditions, if it could be defined at all. The same could be said about the overall exposure to risk. However, some directors and risk managers accepted there was a need to find ways of conveying more useful information.

Also in 2011, another friend (Richard Anderson) led the development of the Institute of Risk Management report ‘Risk Appetite and Tolerance’. While I disagreed (as a reviewer of the draft) with much of the content of the report, this was useful:

  1. Risk appetite can be complex. Excessive simplicity, while superficially attractive, leads to dangerous waters: far better to acknowledge the complexity and deal with it, rather than ignoring it.
  2. Risk appetite needs to be measurable. Otherwise there is a risk that any statements become empty and vacuous. We are not promoting any individual measurement approach but fundamentally it is important that directors should understand how their performance drivers are impacted by risk. Shareholder value may be an appropriate starting point for some private organisations, stakeholder value or ‘Economic Value Added’ may be appropriate for others. We also anticipate more use of key risk indicators and key control indicators which should be readily available inside or from outside the organisation. Relevant and accurate data is vital for this process and we urge directors to ensure that there is the same level of data governance over these indicators as there would be over routine accounting data.
  3. Risk appetite is not a single, fixed concept. There will be a range of appetites for different risks which need to align and these appetites may well vary over time: the temporal aspect of risk appetite is a key attribute to this whole development.
  4. Risk appetite should be developed in the context of an organisation’s risk management capability, which is a function of risk capacity and risk management maturity. Risk management remains an emerging discipline and some organisations, irrespective of size or complexity, do it much better than others. This is in part due to their risk management culture (a subset of the overall culture), partly due to their systems and processes, and partly due to the nature of their business. However, until an organisation has a clear view of both its risk capacity and its risk management maturity it cannot be clear as to what approach would work or how it should be implemented.
  5. Risk appetite must take into account differing views at a strategic, tactical and operational level. In other words, while the UK Corporate Governance Code envisages a strategic view of risk appetite, in fact risk appetite needs to be addressed throughout the organisation for it to make any practical sense.
  6. Risk appetite must be integrated with the control culture of the organisation. Our framework explores this by looking at both the propensity to take risk and the propensity to exercise control. The framework promotes the idea that the strategic level is proportionately more about risk taking than exercising control, while at the operational level the proportions are broadly reversed. Clearly the relative proportions will depend on the organisation itself, the nature of the risks it faces and the regulatory environment within which it operates.

====================================================================

This is all great theory, but do risk appetite statements help decision-makers in practice?

A lot of this theory is about board members and executive leaders seeing whether the overall level or amount of risk (using COSO language) is acceptable.

But can you really aggregate sources of risk like these into one “amount of risk”? Remember that the likelihood of two independent events or situations both happening is the product of their individual likelihoods (so lower than either alone), and that the level of risk is a range of potential effects, not a single point:

  • Credit risk
  • Safety risk
  • Compliance risk
  • Cyber risk
  • Operational risk
  • Cash flow risk
  • Supply chain risk
  • Third party risk
  • Etc.

Not if you want a number that means something, and many of the participants in the FRC report agreed.

In any event, all you can do is calculate the sum (good luck with that) and compare it with some number somebody has previously calculated as acceptable.
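To make that concrete, here is a minimal sketch in Python (standard library only, with purely hypothetical loss distributions) of why adding individual “worst plausible case” numbers does not give the worst plausible case of the combined exposure:

```python
import random

random.seed(42)
N = 100_000

def p95(xs):
    """95th percentile of a sample."""
    return sorted(xs)[int(0.95 * len(xs))]

# Two hypothetical, independent sources of loss, each a range of
# outcomes rather than a single point (illustrative distributions only).
cyber  = [random.lognormvariate(0, 1.0) for _ in range(N)]   # skewed
credit = [random.gauss(10, 3)           for _ in range(N)]   # symmetric

combined = [c + r for c, r in zip(cyber, credit)]

print(p95(cyber) + p95(credit))   # sum of individual 95th percentiles
print(p95(combined))              # 95th percentile of the actual combined exposure
```

For these independent, hypothetical distributions, the 95th percentile of the combined exposure comes out lower than the sum of the individual 95th percentiles: adding point estimates together misstates the aggregate, because each exposure is a distribution, not a number.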

I think we can discard the concept that there is a single “amount of risk”.

Now let’s examine again the examples above.

The United States Agency for International Development talked about having “a MEDIUM risk appetite”.

Sorry, that is fluff that means nothing.

The OCC said it:

“…has no appetite for unauthorized access to systems and confidential data.”

“…has a low appetite for losing continuity of business operations stemming from unreliable telecommunications or system availability.”

“…has a moderate appetite for innovative technology solutions to meet user demands in a rapidly changing environment.”

How does anybody use that to guide decision-making?

If there is no appetite, that means you need the risk to be zero – and that is impossible without shutting down the organization.

Network Rail also fluffs around:

Network Rail has no appetite for safety risk exposure that could result in injury or loss of life to public, passengers and workforce.

In the pursuit of its objectives, Network Rail is willing to accept, in some circumstances, risks that may result in some financial loss or exposure including a small chance of breach of the loan limit. It will not pursue additional income generating or cost saving initiatives unless returns are probable.

The company will only tolerate low-to-moderate gross exposure to delivery of operational performance targets including network reliability and capacity and asset condition, disaster recovery and succession planning, breakdown in information systems or information integrity.

The company …will not accept any negative impact on reputation with any of its key stakeholders, and will only tolerate minimum exposure.

I recognize that the regulators in some industries, and even some corporate governance codes, require a risk appetite statement.

But fluff only checks the box.

Fluff doesn’t add value.

The challenge of ensuring people take the right level of the right risks to achieve objectives remains.

That is why people seem (as they should) to be moving towards some form of risk limits (“risk criteria” in ISO 31000 language).

Alexei’s guide is about limits or guidance for specific sources of risk, rather than an “amount of risk”. When he refers to a risk appetite for this or that risk, he is not using the term as COSO defines it. I would translate his use of “appetite” as risk limits or criteria.

If you want to tell people how much risk to take in certain well-defined repetitive decisions and situations, risk criteria or limits can work – if you can define that level of risk in practical terms.

For example, it may make sense to establish a limit on the amount of credit that may be granted to a new customer. However, for that risk limit to be the right one for the organization, it can’t be arbitrary. It has to be the result of an analysis (updated periodically) that balances the risk of credit default or slow payment against the additional revenue and cash generated by higher limits.
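A minimal sketch of what such an analysis might look like (all numbers are hypothetical; a real model would be fitted to the organization’s own sales and collections data and refreshed periodically):

```python
MARGIN = 0.20  # hypothetical gross margin earned on incremental credit sales

def default_rate(limit):
    """Hypothetical: the chance of default rises with the exposure granted."""
    return 0.02 + 0.000002 * limit

def expected_profit(limit):
    """Expected profit of granting `limit` of credit to a new customer.

    Deliberately oversimplified: the customer either pays in full
    (earning MARGIN on the limit) or defaults on the whole balance.
    """
    p = default_rate(limit)
    return (1 - p) * MARGIN * limit - p * limit

# Pick the candidate limit with the best balance of risk and reward.
candidates = range(5_000, 80_001, 5_000)
best = max(candidates, key=expected_profit)
print(best, round(expected_profit(best), 2))   # → 35000 3220.0
```

The arithmetic is not the point; the point is that the limit falls out of an analysis of current data, balancing default losses against incremental revenue, rather than being picked arbitrarily.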

It may make sense to define how excess cash at the end of each day is invested. But that policy and its risk limits have to be reviewed and updated (if needed) periodically. See this Storytime with Norman video.

The test of a risk appetite statement, risk tolerance, risk limit, or risk criterion is whether it leads to taking the right level of the right risks to achieve objectives.

====================================================================

Time to lay out a series of steps for assessing your risk appetite statement or risk criteria.

  1. Does it meet the needs of the organization? Does it help decision-makers take the right level of the right risks, balancing “risk and reward”, to achieve organizational objectives? Is it measurable, rather than expressed as “low”, “medium”, etc.?
  2. Does it help both strategic and tactical decision-making across the extended enterprise?
  3. Are there adequate controls over the (risk) analysis behind the risk limit? Does it use reliable, complete, accurate, and current data? Are any calculations correct? Is judgment properly applied and the right people involved?
  4. Is it periodically reviewed to ensure that it remains appropriate in a dynamically changing environment?
  5. Does it encourage risk-taking (including taking more risk) when that is right for the business? Are people adequately or overly risk-averse?
  6. Do people understand it and how it relates to their decisions and their responsibilities?
  7. Do people believe in it?
  8. Is it followed and what happens with exceptions?
  9. Does it help the board and executive leaders understand the decisions being made and whether the right levels of the right risks are being taken to achieve objectives?
  10. How can it be improved and made more practical and valuable?

Or is the risk appetite statement fluff designed to meet the regulator’s mandate, adding little in terms of value?

You can probably tell that I am not a fan of risk appetite or risk appetite statements. But if an organization deploys risk limits or criteria that work and calls them, as a whole, their risk appetite statement, I might be able to go along with the idea.

However, I remain convinced that the path to success is by understanding how what might happen (i.e., risk, both positive and negative) might affect the achievement of enterprise objectives.

I’ve said a lot! What do you think?

Auditors need to understand risk management.

June 29, 2023 9 comments

The current internal auditing standards from the IIA (not the draft update) require internal auditors to assess the adequacy of management’s risk management processes.

That sounds fine, but why?

Don’t do something just because the standards dictate it: there has to be a good business reason.

In this case, the reason is that if management does not have an adequate set of processes for understanding and addressing what lies ahead (i.e., both the potential positive and negative effects of risk – see below), they are not assured of making the right business decisions to achieve their objectives.

In other words, inadequate risk management (failing to understand and consider what might happen) is itself a source of risk to objectives.

If management is not making business decisions, both periodic and daily strategic and tactical decisions, with a reasonable understanding of what may lie ahead, is it likely they are making the right decisions? (I refer to them as “informed and intelligent decisions”.)

Note that I am not talking about risk as being limited to the downside, even though most “lists of risks” published by consultants (and the IIA) only talk about bad things that might happen.

Both COSO and ISO, and corporate governance codes such as the South African King IV report, define risk as having either (or both) positive and negative effects. (King IV makes this clearer by talking about “risk and opportunity.”)

I prefer to talk about “what might happen” rather than using a term that has negative connotations[1].

When I advise auditors who want to assess risk management, I ask them to ask this question:

Does the management of risk meet the needs of the organization?

I don’t want the auditor to assess compliance with corporate policies around risk management. I don’t even want them to assess conformance with a risk framework or standard (such as COSO ERM or ISO 31000). Neither provides assurance that risk management meets the needs of the organization.

I want them to determine whether risk-related processes are helping management and the board make the informed and intelligent decisions necessary for success (in the eyes of those decision-makers).

Does that mean that a list of the more significant (adverse) risks has no value? No. Management and the board need to understand them and make related informed and intelligent decisions.

But not in a vacuum!!!

Think about your personal life.

  1. Do you and your partner have and maintain a list of the more significant risks, such as fire, earthquake, loss of jobs, car accidents, grocery or gas price hikes, etc.?
  2. Do you have a periodic meeting with your partner, maybe a family meeting, to review and discuss the more significant risks?
  3. When you make a decision such as where to go on vacation or which car to buy, are you guided by your risk appetite statement?

No.

When you make decisions, whether small ones such as when to leave for work or major ones like whether to ask your partner to marry you, you consider both what you are trying to achieve (“reward”) and what might happen along the way (“risk”). You make a balanced decision, weighing the benefits against any potential harm.

Now let’s consider how the CAE runs the internal audit department.

  1. Does the CAE maintain a list of the more significant risks?
  2. Does the CAE have a periodic staff meeting to discuss and review those risks?
  3. Does the CAE use a risk appetite statement to guide decisions such as what engagements to put on the plan, who to assign, and the budget to be allotted to each project?

No.

So why should we think management and the board should do what we don’t do ourselves?

In fact, surveys consistently show that boards and management teams find little actionable information in the risk reports they are receiving.

This is what I said in Understanding the Business Risk that is Cyber: A guide for both business executives and InfoSec managers to bridge the gap:

Most leaders are not able to see the whole picture. Frankly, there are two reasons for that:

  1. They are not shown the big picture. Typically, they are shown a list of [downside only] risks by the CRO, and an assessment of cyber risk (out of context) by the CISO.
  2. They don’t ask to see the big picture!

A 2022 survey by Diligent of US cyber risk practitioners found:

82% of respondents do not feel very confident in their ability to clearly communicate risk to the board.

A McKinsey article[2] repeats this when it says:

Most reporting fails to convey the implications of risk levels for business processes. Board members find these reports off-putting—poorly written and overloaded with acronyms and technical shorthand.

At a recent cybersecurity event, a top executive said: “I wish I had a handheld translator, the kind they use in Star Trek, to translate what CIOs and CISOs tell me into understandable English.”

Cyber risk information needs to be provided in business language that enables informed and intelligent decision-making.

It also has to be put into context.

In fact, every source of risk and opportunity, not just cyber risk, has to be put into context.

As I said before:

It’s not about managing risk; it’s about managing the business!

Making decisions about cyber risk out of context rarely makes good business sense!

In a post earlier this month, I quoted Lisa Young when she said:

In a recent ISACA survey, 66 percent of respondents said their executive leadership team [ELT] sees value in conducting IT risk assessments.

My guess is that they don’t see the value because the information generated by the risk assessments doesn’t help them make the business decisions necessary for success.

The information is one-sided. It only shows the downside, not the reasons to take risks.

None of these are effective devices when it comes to helping decision-makers make the informed and intelligent decisions necessary for success, taking the right levels of the right risks:

  • Risk registers
  • Risk heat maps
  • Risk profiles
  • Risk appetite statements
  • Risk models that only quantify the downside
  • Anything that shows downsides in a way that cannot be compared either to upsides or other sources of risk

Unfortunately, the IIA has not been a reliable source of training or guidance when it comes to understanding how risk management needs to function if it is to deliver what the organization needs for success. (This guidance was an exception.)

So where can auditors find the training and guidance they need to understand risk management (apart from my books)?

I welcome your comments, especially suggestions for where auditors can find training or materials that will help them understand how risk management can be effective.

They should understand what effective risk management is before they try to assess it!

===================================================================

[1] See Risk Management in Plain English.

[2] Cyber risk measurement and the holistic cybersecurity approach, November 2018.

[3] I am an Honorary Fellow of IRM and a member of the advisory board of IRM: India