Wonder Woman — “Reasonable Assurance” and Cybersecurity
One of my good friends, Brian Barnier[1], has written an interesting piece on cybersecurity targeted at internal auditors but also relevant for other practitioners. Brian is one of the smartest people I know (I am lucky to know and learn from so many) and an expert on technology and financial management. He is also the author of a couple of risk management books.
His article, which I show in its entirety below, suggests that we need to use “design thinking” to assess whether an organization’s cybersecurity meets its needs.
Brian doesn’t tackle the question of whether you assess cyber based on risk to information assets (NIST, ISO, and FAIR) or risk to the achievement of business objectives (Marks et al.).
But I recommend reading and considering his point of view.
I welcome your comments. You can also contact Brian directly via his LinkedIn profile.
=====================================================================================
Wonder Woman — “Reasonable Assurance” and Cybersecurity
Brian Barnier & Prachee Kale
An award-winning film director and her CISO sister are enjoying dinner al fresco. Savoring wine under glowing fairy lights, they compare professional notes …
“Paula, every conversation about your cyber stuff grows my world beyond the art, logistics, and risk management of films. Yet controls, lines of defense, and insider threats call to mind Ocean’s Eleven or Wonder Woman, Diana’s quest to vanquish Ares, the God of War.”
“Sasha, how can you say Wonder Woman?”
“Paula, films typically turn on two discoveries: a character’s self-discovery and their discovery of the world around them. Diana, despite her intelligence and strength, was wrong but quickly adapted. In cybersecurity, how is creative thinking developed? How are problems framed? How quickly do people change?”
Stepping away from the sisters…
What, exactly, is a “control”?
Pause your reading of this article. Write down your definition of “control.” Now ask five colleagues to do the same and compare notes.
Surprised?
If you walk through the woods with specialists (an ecologist, an entomologist, and a businessperson), each will have different observations. Cognitive biases cause people to force-fit their mental models onto experiences and concepts.
Investigating “controls,” we discover two origin stories.
- Financial reporting controls trace back to ancient Egyptian grain accounting[i].
- Automated controls trace to ancient Greek fishing and hunting gear[ii]. They developed into Leonardo da Vinci’s machines, like the cam hammer.
Historically, accountants were cautious about applying financial reporting-style controls to business operations. In 1980, in a seminal study funded by the Financial Executives Institute (FEI), the authors “…found it very difficult, if not impossible, to develop a list of significant procedures that a company must perform or be judged lacking in internal control.[iii]”
Michael Cangemi, former CEO of FEI and COSO Board Member, recalls: “I explored auditing internal control for Foreign Corrupt Practices Act compliance when I joined Phelps Dodge as Chief Audit Executive in 1980. Companies have always developed processes for ensuring the protection of assets and internal control. I found that internal control is different in every company, does not easily lend itself to frameworks or checklists and requires much more subjective auditing.”
What is NOT a “cybersecurity control”?
As detailed in “Cybersecurity: The Endgame – Part One,” an unintended consequence of the Sarbanes-Oxley Act was the application of financial reporting-style controls to cybersecurity.
Dan Goelzer, a retired partner at Baker McKenzie and former Acting Chair of the PCAOB, is the author of an insightful newsletter on PCAOB activities.
He observes, “Operational controls are only secondary to financial reporting controls in the sense that, if they fail, you ‘only’ might go out of business – potentially devastating to you, your investors and your employees. If you don’t have good ICFR you might, at least in theory, go to jail. People should not, but sometimes do, confuse ICFR with cybersecurity controls. Preventing and repelling cyber-attacks is far beyond ICFR.”
The two types of controls are entirely different in design for entirely different purposes.
- ICFR-style controls manage the risk of inaccurate recording of the financial consequences of tangible transactions that occurred in the past, within a relatively stable system
- Automated controls manage the risk of cascading situations in the future, within a dynamic system
Applying ICFR-style controls to cybersecurity is a definition error. Would you fly in a plane with ICFR-style controls?
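To make the design difference concrete, here is a minimal, purely illustrative sketch in Python. The function names, data, and thresholds are hypothetical, invented for this example rather than taken from the authors. An ICFR-style control tests past, static records at a point in time; an automated control runs as a closed feedback loop that measures a live system and intervenes before a deviation cascades.

```python
# Hypothetical sketch: two fundamentally different control designs.
from dataclasses import dataclass

@dataclass
class Transaction:
    recorded: float  # amount posted to the ledger
    actual: float    # amount per supporting evidence

def icfr_style_check(transactions, tolerance=0.01):
    """ICFR-style control: a point-in-time test over past transactions.
    It can report whether history was recorded accurately, but it
    cannot react to what the system is doing right now."""
    return all(abs(t.recorded - t.actual) <= tolerance for t in transactions)

def automated_control(read_sensor, actuate, limit, cycles=100):
    """Automated-style control: a closed feedback loop over a dynamic
    system (think thermostat or industrial governor). It continuously
    observes state and counteracts deviations as they emerge."""
    for _ in range(cycles):
        reading = read_sensor()        # observe the current state
        if reading > limit:
            actuate(reading - limit)   # intervene before it cascades

# Usage: the first audits the past; the second steers the present.
books = [Transaction(100.0, 100.0), Transaction(50.0, 48.0)]
print(icfr_style_check(books))  # False: the error is found only after the fact
```

The design point is the loop: an ICFR-style check produces a verdict about history, while an automated control is built to act, continuously, on the system it protects.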
Paul Sobel, former IIA Chair and current COSO Chair, summarizes based on the specific definitions of each type of control…
“When facing cyber risks, ‘reasonable assurance’ is not sufficient. ICFR with reasonable assurance was not designed to provide ‘as close to absolute assurance as possible.’ Lessons learned from designing industrial control systems can provide that assurance. Also, dynamic methods of managing risk are needed to survive in the fierce world of cyber-attacks.”
Wonder Woman embraced the unassuming Sir Patrick. His demeanor gave her reasonable assurance that he couldn’t be Ares. Diana was wrong.
For auditors, chasing the wrong types of controls is life on a gerbil wheel – high risk, little business impact, monster spend and unfulfilling.
Another false sense of security and blind spot was Diana’s “god killer” sword. It slew Ludendorff, but Ares casually destroyed it.
The misapplication of ICFR-style controls contributes to breaches, waste and pain. It warrants fixing with safer solutions.
- Cybersecurity is a system, so apply systems thinking
- Power up cybersecurity and drive better business outcomes. Apply design thinking, the vanguard of cybersecurity.
Beginning steps:
- Eliminate futile ICFR-style controls for cybersecurity
- Fix ICFR-style controls that are helpful, such as IT systems hygiene. But realize that 1) they lack the reliability of automated controls, 2) their cost is excessive, and 3) they can distract from safer actions.
- Focus on automated-style controls that work the way IT systems reliability engineering does
- Outthink cyberwarfare enemies — embrace robust scenario analysis. Ask, “Would the scenarios make a good film?[iv]”
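As a purely hypothetical illustration of structured scenario analysis (the actors, vectors, and objectives below are invented for this sketch, not drawn from the article), a workshop can enumerate combinations systematically and then stress-test each one as a storyline:

```python
# Hypothetical sketch: enumerating candidate cyber scenarios for a workshop.
from itertools import product

actors = ["insider", "criminal group", "state-sponsored team"]
vectors = ["phishing", "supply-chain compromise", "stolen credentials"]
objectives = ["data theft", "ransomware", "operational sabotage"]

# 3 x 3 x 3 = 27 candidate storylines to challenge and prioritize.
scenarios = [
    f"A {actor} uses {vector} to pursue {objective}."
    for actor, vector, objective in product(actors, vectors, objectives)
]

for story in scenarios[:3]:  # print a few; the workshop debates them all
    print(story)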
Here is a key challenge: the struggle to change, even in life-threatening situations, has been studied since Aristotle, Plato, and Thucydides. Organizational mass and inertia resist change. Overcoming them requires a catalyst.
Surprise — the catalyst for improvement is you!
Let’s finish our walk in the woods. As an auditor, compare your view to that of the ecologist, who sees the wood’s ecosystem, and the businessperson, who sees its financial value. Each specialist, limited to one discipline and its biases, misses the 3-D view. You expand your influence and impact by seeing what others miss.
Making change easier
- Reframe to clarify the real problem. Symptoms often mislead: discover alternative diagnoses and think differently. View cybersecurity as a system; the whole is greater than the sum of its parts.
- Address “hardwired” resistance. Have powerful but safe conversations and factor in different perspectives to find root causes. Offer choices and reasons for change.
- Design the shortest path to an ideal future
Find accelerants — a transformation leader, an innovation/design lab or a professional coaching program. Why aren’t auditors coached and invited to such labs? Primarily because audit isn’t viewed as value-creating.
It’s worth its weight in palladium to partner with coaches and innovators to generate the gift of value.
Design thinking, including envisioning alternative futures, is powerful. Facing cyberwarfare, consider five futures:
- Same cybersecurity methods, no change — worst future
- Same methods, more money and run faster — degraded future
- Minor improvements, more money and run faster — static future
- Cutting and/or fixing ICFR-style controls, one-time spend, improved operations — better future
- Fully fixing ICFR-style controls, applying automated controls, and shifting to a systems and psychology approach — best future
Which one would you pick?
In summary:
- ICFR wasn’t designed for cybersecurity
- The opportunity cost of inaction is very high
- Valuable change is based on systems thinking and psychology, applied with design thinking
- Your personal opportunity — generate the gift of value
Back to the sisters…
“Sasha, creativity and design seem to have much to offer cyber!”
“Yes Paula, so, how fast will you fix it? I know you’ll be my hero!”
Note: This article was adapted from Brian Barnier & Prachee Kale (2020), “Cybersecurity: The Endgame – Part One,” EDPACS, Taylor & Francis[v].
Disclaimer: Views expressed by the authors are their own and not necessarily those of their employers.
[1] Brian, Michael Rasmussen, and I were the first three honored as Fellows of the Open Compliance and Ethics Group (OCEG).
[i] https://news.uchicago.edu/story/archaeologists-find-silos-and-administration-center-early-egyptian-city
[ii] https://collections.mfa.org/objects/153702
[iii] https://www.amazon.com/Internal-Control-U-S-Corporations-Research/dp/0910586330
[iv] Two chapters of The Operational Risk Handbook are devoted to scenario workshops https://www.harriman-house.com/the-operational-risk-handbook-for-financial-companies
[v] https://www.tandfonline.com/doi/abs/10.1080/07366981.2020.1752466