
New information and perspectives on cyber security

The world continues to buzz about cyber security (or, perhaps we should say, insecurity). Now we have the Chinese government apparently admitting that they have a cyberwarfare capability: not just one unit, but three. Other nations, including the United States, Japan, and several in Europe, are talking about their ineffective defenses and the need to develop an offensive capability.

What can the targets do about this? And the targets are not only public and private companies but each of us as individuals (yes, our personal devices are constantly under attack).

The first step is to get our collective heads out of the sand and understand that we are all, collectively and individually, at risk. The level of successful attacks is enormous (a billion records with personal information were hacked in 2014, according to IBM, as reported here). According to a survey discussed in Fortune, 71% of companies admit they were hacked last year and the majority expect to be hacked this year. However, nearly a quarter, according to Fortune, have not only kept their heads in the sand but do so with unbelievable confidence: they think a successful cyber attack is “not likely” in the next 12 months. The trouble is that very often successful attacks are not detected! It took a long time before JPMorgan Chase found out they had been hacked, and even longer before they knew the extent of the damage.

Organizations need to be ready to respond effectively and fast!

The JPMorgan Chase article reports that “The people with knowledge of the investigation said it would take months for the bank to swap out its programs and applications and renegotiate licensing deals with its technology suppliers, possibly giving the hackers time to mine the bank’s systems for unpatched, or undiscovered, vulnerabilities that would allow them re-entry into JPMorgan’s systems.”

All is for naught if successful intrusions are not detected and responses initiated on a timely basis. In the Target case, reports say that the security monitoring service detected suspicious activity but the company did not respond. According to ComputerWeekly.com, many companies make the mistake of “Over-focusing on prevention and not paying enough attention to detection and response. Organisations need to accept that breaches are inevitable and develop and test response plans, differentiating between different types of attacks to highlight the important ones.”

Another insightful article discusses the critical need for pre-planned response capabilities. IT cannot do it all by itself; business executives need not only to be involved but to work actively to ensure their operations can survive a successful intrusion.

What else should we do?

We have to stop using passwords like ‘password’, the name of our pet, or our birthday. Password managers are excellent tools (see this article on the top-rated products) and merit serious consideration. I have one. (By the way, I don’t plan to replace it with Yahoo’s latest idea of one-time text messages, although I do like the fingerprint authentication on my iPhone.)
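
Strong, unique passwords are exactly what a password manager will generate for you. Just to show how little machinery is involved, here is a minimal, purely illustrative sketch using Python’s standard-library secrets module; it is not a substitute for a proper password manager, which also stores and fills those passwords securely.

import secrets
import string

def generate_password(length: int = 20) -> str:
    """Return a random password drawn from letters, digits, and punctuation,
    using the cryptographically secure secrets module (not random)."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

if __name__ == "__main__":
    print(generate_password())   # e.g. a 20-character random string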

A risk-based approach to cyber security is the right path, in my view. But that does mean that organizations have to continuously monitor new and emerging risks, or new observations about existing risks. An example is a new article on insecure mobile apps – both from in-house developers and from external sources.

Organizations need to allocate resources to cyber and information security commensurate with the risks, and individuals have to take the time to update the software on their personal devices. Internal audit departments should make sure they have the talent to make a difference, providing objective evaluations and business-practical suggestions for improvement.

Companies and individuals both need to make sure they apply all the security patches released by software vendors. These patches address the vulnerabilities most often targeted, and when there is a breach it is very often because the patches have not been applied.
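
For those responsible for systems, even a simple scheduled check for available updates helps keep patching honest. As one illustrative sketch (assuming a Python environment with pip on the path; equivalent checks exist for operating systems and other package managers):

import json
import subprocess

def outdated_packages() -> list:
    """Ask pip which installed packages have newer releases available."""
    result = subprocess.run(
        ["pip", "list", "--outdated", "--format", "json"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

if __name__ == "__main__":
    for pkg in outdated_packages():
        print(f"{pkg['name']}: {pkg['version']} -> {pkg['latest_version']}")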

As individuals, we should have a credit monitoring service (I do), set up alerts for suspicious activity on our bank accounts, and apply whatever anti-virus and spam protection is reasonable.

Finally, as individuals and as organizations, we need to make sure we and our people are alert to the hackers’ attempts through malware, social engineering, and so on. It is distressing that so many successful intrusions start with somebody clicking where they should not be clicking.

Here are a couple of articles worth reading, plus a publication by COSO (written by Deloitte) on how its Internal Control Framework can be used to address cyber risks.

Cybersecurity in 2015: What to expect

Cybersecurity Hindsight And A Look Ahead At 2015

COSO in the cyber age

As always, I welcome your comments.

  1. March 21, 2015 at 12:02 PM

    I’ll add a few additional comments to muddy the waters just a little.

    1. At a recent seminar conducted by a couple of white-hat hackers for CPAs, one of the presenters noted the potential for social engineering as a route to system penetration. This prompted me to ask if the growing prevalence of staffing with temps is leading to an increase in exposure to social engineering. The premise for the question was that temps are less likely to be effectively indoctrinated into security measures, and more likely to be vulnerable since they don’t know the organization, and are less motivated to because… they are temps! The white hat demurred, and said there is not yet evidence that this is a problem. I would respond that the lack of evidence does not assure the lack of a problem.

    2. Individually and collectively, I can’t imagine that we can do enough if system manufacturers and developers build back doors into their systems, either for their own convenience or the government’s. Here the ‘greater fool than I’ fallacy kicks in: that is, ‘I’m smarter than the other guy. I can build the bullet-proof doorway and control it.’ Murphy’s postulate 73 applies here: everything works… until it doesn’t.

    3. A couple of years ago, the information industry media promoted this nonsense about how young workers DEMAND compatibility or co-existence or accommodation or whatever with their personal devices and access to social media of all flavors. Utter nonsense. As the recent Hillary hilarity should drive home: keep your business on business devices and your personal life on personal devices, and let everyone understand and respect boundaries.

    4. I suspect there’s an ‘honor among thieves’ conundrum at play in the insistence of the federal government on playing a role in corporate information security and the resistance of corporations to collaborate. First, the corporations do not trust the intentions of the government, and with good reason. Second, I suspect, too many corporations have reason to fear federal eyes peeking into corners where corporations would not want them to go, for reasons that the feds would have every right to know. Checkmate. At the expense of the rest of us who depend on both.

    This is problematic because corporations cannot be allowed to wage cyber warfare in their defense, and the government cannot mount an adequate defense, particularly of critical industries such as banking and the grid and transportation, without close collaboration with corporations.

    5. While I am wary of the intentions of the US government, I have no uncertainties in my mistrust of Facebook and Google and Yahoo in their security protocols and management intentions. To whatever degree corporations and individuals choose to interface with these for-profit entities and invite them into their cyber lives, I suspect that they invite another door for intrusion that they do not need.

    6. Finally, it is classic American arrogance that we have engaged in cyber attacks, with the reasonable presumption that there are counter-parties with at least equal capability and greater motivation, and with sufficient knowledge of our own areas of vulnerability….but with grossly inadequate defenses against reprisal. These risks did not arise overnight. Y2K gave us more than adequate insight into our system management vulnerabilities and proclivities for procrastination on priorities.

    • March 22, 2015 at 8:39 PM

      Great comments, Sidney. To answer number 1, I’d say that it’s a resounding “yes”. At a Deloitte event a couple of weeks back, one of the presenters told the crowd he’d come across no fewer than four cases in the past year of employees planted using stolen IDs. Background checks all clean, but they weren’t who they said they were. All four were suspects in separate cases of various forms of theft and spying, and in one case of sabotaging oil industry equipment on behalf of a radical eco-group.

  2. March 21, 2015 at 11:37 PM

    I would also like to say that the US has been introducing backdoors for years. It has agreements with security software providers to include these backdoors in the default config.

  3. March 22, 2015 at 6:30 AM

    ….And one other thought, courtesy of the Y2K experience:

    7. Supply chain vulnerability.

    An entity’s cyber-security is only as good as its ‘extended system environment’. By this, I mean not merely the systems secured within the boundaries of its corporate entity, but those to which it is connected as part of its extended virtual ‘organization’ in data sharing as integral components of its service/supply chain.

    In the lead-up to Y2K, too many organizations remained complacent in their preparations until the legal profession smelled the opportunity for litigation against entities whose systems failed with negative consequences for customers/stakeholders/etc. Then they got religion. But as they tightened up their own ship and battened down the hatches for the unknown, it also dawned on them that their posture was only as good as their correspondents up and down the supply/service chain.

    That perception set off a last-minute surge of CYA correspondence among supply/service chain collaborators to assure each other that everybody had tidied up their house for the calendar flip.

    It seems to me that the same exposure exists today, greatly complicating the task of securing the information environment, only far worse. Unlike Y2K, we are not looking at a moment-specific point of failure. We are operating in a dynamic environment of constant change among all parties. It necessitates a recognition of shared risk and shared responsibility for managing it among all affected parties. I do not sense that we as a society (particularly our techno-gunslingers in Silicon Valley) are willing to surrender independence (no matter how illusory) for security and accept the responsibilities of mutual cooperation that are imperative to a secure environment for all.

    We are our own worst enemies.

  4. March 22, 2015 at 4:40 PM

    Thanks, Norman, for this excellent summary of today’s situation.

    Any organization not mandating (and enforcing) complex passwords–and two-factor authentication for critical systems–is just not paying attention. Alas, many are not.

    The same goes for data encryption, for both data ‘at rest’ and ‘in motion.’ I was truly shocked to read in the WSJ that Anthem chose not to encrypt member data, for example (WHERE was their Board?).

    The prior comment mentioned weaknesses in partner system access, and I agree as well (Target being penetrated via a vendor portal is the recent case in point).

    The list goes on and on (Sony losing track of firewalls, for instance)…but the failure is reasonably simple to explain:
    — A ‘tone at the top’ that allows sloppy IT AND sloppy risk management to be tolerated;
    — Inadequate IT leadership that either doesn’t understand the risks or can’t articulate a convincing business case to the CEO/Board (if the CIO even has a seat at the table from which to be heard);
    — Inadequate IT architectural planning that considers security something ‘bolted on’ rather than ‘built in’ when systems are designed and constructed.

    I must point out that our ever-increasing reliance on IT exposes firms to many risks besides the recently well-publicized breaches: lack of adequate disaster recovery capability, ‘big bang’ projects that fail, and data problems that lead to SOX failures can all cripple a firm. And lack of investment oversight can leave a firm vulnerable to a more-nimble competitor or a disruptive market entrant.

    IT today requires a level of management understanding that transcends the IT team itself and extends across the C-Suite and up to the Board.

  5. March 22, 2015 at 9:00 PM

    Norman, to your comments I can only add this: there’s a strong need to “de-risk” the data we possess as organizations. The first thing I did with my current assignment, back in 2010, was pare back the PII possessed by my firm on behalf of our clients to the absolute minimum. Through successive queries to our clients we dropped SINs (the Canadian equivalent of US SSNs), home addresses and telephone numbers.

    The major credit card vendors are working on tokenization systems whereby one-time tokens replace the credit card number itself in transit from the point of purchase back to the issuer. Someone on a SIRA thread (the Society of Information Risk Analysts; you can join the email list at https://www.societyinforisk.org/) reports that it’s working.
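
    The idea is simple enough to sketch in a few lines of Python. Everything here is illustrative only: the dict stands in for the issuer’s hardened token vault, TokenVault is a made-up name, and the card number is the standard test PAN.

    import secrets

    class TokenVault:
        """Toy token vault: maps one-time tokens to card numbers (PANs).
        A real vault is a hardened service at the issuer/network, not a dict."""

        def __init__(self):
            self._tokens = {}

        def tokenize(self, pan: str) -> str:
            token = secrets.token_hex(8)    # random surrogate with no relation to the PAN
            self._tokens[token] = pan       # only the vault knows the mapping
            return token

        def detokenize(self, token: str) -> str:
            return self._tokens.pop(token)  # one-time use: the token is consumed

    vault = TokenVault()
    token = vault.tokenize("4111111111111111")   # the merchant and network see only this
    pan = vault.detokenize(token)                # recovered only on the issuer side

    The point is that a token intercepted in transit is worthless outside the vault, and once used it cannot be replayed.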

    I think we have to look at whether SINs/SSNs can really serve the dual purpose of tracking taxes and credit, and take the things away from the credit bureaus. There’s no reason that a token system wouldn’t work there as well.

    The legal profession is moving into this space, and I think that all of today’s standards, from COSO to PCI DSS to the ISO standards to COBIT to SOC 2 audits, may well be wiped clean as organizations and board members are held increasingly liable. I’m becoming more interested in what my insurance firm has to say than in what my service auditors have to say.

    And the reason: we’ve been far too casual with people’s data for far too long. I experienced a full-blown PII breach and learned several interesting things. An officer with the privacy/fraud unit told me that ID theft is now as common here in Toronto as traffic violations. He also told me that theft involving my home mailing address could follow on to the first fraud that we indeed experienced.

    I changed credit card companies, insurance firms, and banks. I changed my home and mobile phone numbers. I’m using a mail box at the UPS store, and route everything but my driver’s license there. I have changed my birthdate at every organization I do business with (why on Earth did anyone think *that one* was a good idea?), and turned my profiles on social media websites into tangled webs of lies. No one has my mother’s maiden name as a security word. Credit monitoring will be a permanent part of our lives. All of this expense and effort because some trust company wasn’t cleaning up records left on a website with a lousy WordPress plugin that was accessible from China, despite the trust company not being licensed to do business there.

    Until we de-risk the data, we’re making ourselves targets. And we will continue to pay.

  6. March 23, 2015 at 7:49 AM

    Hi Norman, you make a lot of excellent points but I’d like to challenge your views in a couple of areas.

    You ask that people use better passwords. I agree, but this is a tactic, not a long-term strategy. Reliance on passwords, though easy, common and cheap, was always going to be a stop-gap method, with fundamental flaws that people are reluctant to face because of the costs involved in adopting alternative approaches. All passwords have become steadily less secure over time, because increasing processing power, and the increasing size of botnets, make it progressively easier for criminals to use ‘brute force’ methods to discover passwords, even if they look like a long string of random symbols! In addition, the general over-reliance on passwords means that each password database hack undermines the security of passwords in general, across all the resources that users access.
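
    The arithmetic is easy to check. A rough back-of-the-envelope sketch in Python (the guess rate is an assumption for illustration; real attack rates vary enormously and keep rising, which is exactly the point):

    # Worst-case brute-force time = keyspace / guess rate.
    ALPHABET = 26 + 26 + 10 + 32      # lower case, upper case, digits, common symbols
    GUESSES_PER_SECOND = 1e10         # assumed rate for a well-resourced attacker

    for length in (8, 10, 12, 16):
        keyspace = ALPHABET ** length
        years = keyspace / GUESSES_PER_SECOND / (60 * 60 * 24 * 365)
        print(f"length {length:2d}: about {years:.3g} years to exhaust the keyspace")

    As the assumed guess rate grows, yesterday’s ‘long enough’ keeps getting shorter, and none of this helps once the password database itself has been stolen.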

    I use the analogy of teaching kids to cross a road. Telling them to look and listen is a perfectly sensible approach to the risk of crossing the road. But the technique breaks down as the size of the road increases, the traffic volume rises and vehicles get faster. Eventually some roads get to a point where you can’t safely walk across them. Pedestrian bridges, subways, signal crossings and other solutions all involve cost and have their own downsides, but these alternative solutions become increasingly necessary. The same is true of access security. Companies and other organizations need to recognize that investment must be directed towards safer alternatives to passwords, for anything but the most basic resources. This means more use of two-factor authentication, and techniques where the user confirms their identity via messages sent to their phone in parallel to identifying themselves over the internet.
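
    For what it’s worth, the server side of that kind of out-of-band confirmation is not much code. A minimal sketch (send_code_to_phone is a hypothetical stand-in for whatever SMS or app channel is used, and the in-memory dict stands in for a real datastore):

    import hmac
    import secrets
    import time

    CODE_TTL_SECONDS = 300
    _pending = {}  # user -> (code, expiry)

    def send_code_to_phone(user: str, code: str) -> None:
        """Hypothetical out-of-band delivery (SMS, push notification, etc.)."""
        print(f"[demo] would send {code} to {user}'s phone")

    def start_second_factor(user: str) -> None:
        code = f"{secrets.randbelow(1_000_000):06d}"  # 6-digit one-time code
        _pending[user] = (code, time.time() + CODE_TTL_SECONDS)
        send_code_to_phone(user, code)

    def verify_second_factor(user: str, submitted: str) -> bool:
        code, expiry = _pending.pop(user, (None, 0.0))
        if code is None or time.time() > expiry:
            return False
        return hmac.compare_digest(code, submitted)  # constant-time comparison

    The hard part is not the code; it is deploying it consistently and handling the lost-phone and helpdesk cases without reopening the hole.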

    Secondly, I completely agree that “Internal audit departments should make sure they have the talent to make a difference, providing objective evaluations and business-practical suggestions for improvement.” However, this is easier said than done. When I worked as an auditor, I found that few of my contemporaries had the ‘horse sense’ required to identify and evaluate information risk, because they lacked a practical understanding of how technology worked. If people don’t commonly work with technology, it’s hard for them to have an intuitive sense of the scale of risk. As a consequence, minor risks can be exaggerated (because they appear on a checklist, or the auditor learned about them by rote). Meanwhile, more important risks are not identified, because that would require some imagination and an ability to synthesize the potential risks that apply to a novel situation. In short, there is no way to achieve your goal except by developing individuals who both understand technology and understand auditing – which is two expensive skillsets when employers are usually only prepared to pay for one. Technologists without audit skills make observations, but without an appropriate sense of the organization’s priorities and goals. Auditors who lack technology skills have skewed priorities and miss risks. But given the way that individuals are educated and developed, it’s difficult for Internal audit teams to recruit individuals who possess a strong combination of technology and audit skills.

    • Norman Marks
      March 23, 2015 at 7:56 AM

      I agree. However, until the systems we use allow us to use other methods, we need people to use better passwords.

      With the increasing use of personal devices on corporate networks, hacks of personal devices and home computers may allow access to corporate assets.

    • Norman Marks
      March 23, 2015 at 7:58 AM

      With respect to internal audit, I made sure that I had the technical resources available. In my youth I was a bit of a techie, so I understood the need to have techie resources available when I was a CAE. Those resources can either be in-house (strongly preferred) or co-sourced.
