The recently released results of a security audit performed on the various systems used by US-CERT to accomplish its cybersecurity mission revealed an unpleasant reality: a total of 671 unique vulnerabilities – 202 of which were high-risk – were detected on the Mission Operating Environment (MOE) system.
I propose that ERM is worth doing and doesn’t have to be so complex if you simply “begin with the end in mind,” as Stephen Covey says in The 7 Habits of Highly Successful Security Leaders. Or would have said if he’d written such a book.
The basis of my thoughts is COSO’s ERM framework (link goes to a PDF of the Executive Summary). Here is the end to keep in mind as you begin your ERM efforts, taken from COSO’s work:
While everyone is worried about stolen laptops or unauthorized access to computer files, who ever thought the hard drives in copy and fax machines could be a potential HIPAA violation?
Copy machines, fax machines and scanners now contain hard drives — like computer hard drives — that store images of all the pages of information that ever ran through the machines, according to the Baudino Law Group.
The Des Moines, Iowa, law firm said after a copy machine was disposed of by a New York-based managed care plan, the plan had to notify three state agencies, federal authorities and more than 400,000 members of a breach of protected health information under HIPAA.
Together they form what I’d call the “privacy GRC” market, where GRC stands for “governance, risk and compliance.” GRC makes up most of what privacy people do.
It’s not a big market. To put things into perspective, Gartner is only in its third year of analyzing the nascent IT GRC market. The privacy GRC market is at the moment no more than just a subset of that.
via Privacy software: Who are the early leaders? – PC World Business.
There’s one section in the standard that is more important than any other, says Tom Wills, security and fraud senior analyst at Javelin Strategy and Research. Requirement 6.2 – “apply a risk-based approach for addressing vulnerabilities” – needs to become the over-arching requirement in the entire standard, he says. “This would mean all security controls should be based on carefully assessed risk, and not on following a checklist.”
Security that’s based on actual risk, not on rote compliance, is the only effective strategy to control against financial losses that result from compromised data. Wills wants to see the PCI council take section 6.2 from the middle of the document and put it in a headline position, with every other requirement rolling up to that. “That would send a clear message to the PCI stakeholders that security does not equal compliance, and that putting security first is what we need.”
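The risk-based approach Wills describes can be sketched in a few lines: rank findings by assessed risk (likelihood times impact) and remediate in that order, rather than working down a checklist. The vulnerability names and ratings below are hypothetical, purely for illustration.

```python
def risk_score(likelihood, impact):
    """Risk = likelihood x impact, each rated 1 (low) to 5 (high)."""
    return likelihood * impact

# Hypothetical assessment results, not from any real audit.
findings = [
    {"name": "Unpatched web server",           "likelihood": 4, "impact": 5},
    {"name": "Missing password-expiry banner", "likelihood": 2, "impact": 1},
    {"name": "SQL injection in payment form",  "likelihood": 5, "impact": 5},
]

# Remediate the highest-risk findings first, regardless of checklist order.
ranked = sorted(findings,
                key=lambda f: risk_score(f["likelihood"], f["impact"]),
                reverse=True)
for f in ranked:
    print(f["name"], risk_score(f["likelihood"], f["impact"]))
```

The point of the sketch is the ordering step: every control decision flows from the assessed score, which is what making 6.2 the over-arching requirement would formalize.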
# The organization must understand which frameworks or framework elements are needed to address, at a minimum, the critical security concerns. When addressing control requirements, more is not necessarily better, and each additional control entity represents an investment in time, money, and effort.
# Choose a base framework. An organization should identify a base framework to contain the additional controls. This framework should be as broad as viable, so that only a minimal set of more specific needs must be addressed separately.
# Break the identified framework elements down according to functional areas and combine controls into like families or tiers. Different frameworks often contain equivalent controls under different headings or focus areas. By understanding where the controls map to one another, existing controls can often simply be enhanced rather than adding entirely new controls for each compliance need.
# Identify critical controls that address the most restrictive requirements. In many situations, there will be control objectives that must be accomplished, intermingled with additional categories that are simply “good-to-have”. The action items that are required for compliance needs should be categorized as more critical.
# Define control “numbering system” and nomenclature. For ease of evaluation and tracking, the combined framework elements should be indexed in a way that allows them to be viewed as parts of a whole. In addition, a formalized control language should be used to address concepts across the new framework, avoiding confusion as compliance efforts begin.
# Identify affected data. Just as it was necessary in the first step to identify which controls and frameworks were needed, it becomes necessary to reverse the process, ensuring that all elements of data that are subject to the collected controls are identified. The majority of this information was known at the start of the exercise, but a second glance after consolidating the requirements often identifies additional data sources, repositories, and systems.
# Understand data flows. As critical as it is to understand the affected data elements, it is just as important to understand where those data elements reside and why. How the information is collected, processed, stored, and transmitted is essential to determining in-scope systems, applications, and processes that must adhere to the new framework.
# Formally define scope of data controlled by the frameworks. After identifying the data flow patterns and practices, a consolidated list of servers, systems, applications, processes, and governance items must be created and then reviewed against expected values.
# Reduce data scope aggressively. Each data control element is an investment in time, money, and effort. The same can be said for each element of the in-scope data that is addressed by the combined framework. Existing business processes and needs should be used to determine if data is being used or retained in inappropriate or unneeded areas. Where possible, data should be consolidated and purged, reducing the overall scope of control coverage, especially critical control requirements such as those brought on by legal or regulatory provisions. (Editor’s note: see Ben Rothke and David Mundhenk’s guidance on reducing PCI scope.)
# Classify affected data according to impact. Some controls will be identified as more critical, and the data elements associated with these will likewise be viewed as more sensitive. These classes of information assets should be classified and labeled to ensure that adequate attention is applied.
# Define data lifecycle elements based upon classification levels and requirements identified by various standards and practices. Once the combined framework controls are in place; the data is identified, scoped, and minimized; and classification levels have been established, a comprehensive data lifecycle program should be implemented. Through this process, end users can manage data elements, complying with the chosen control framework requirements without having to conduct extensive research into sometimes arcane control sets.
# Review existing infrastructure, policy, and procedure against the consolidated framework and data lifecycle requirements. Governance and operational resources must be reviewed against the newly developed framework and associated lifecycle elements. Where needed, changes should be made to support the new controls system.
# Implement consistent solutions across all data elements located within the tier. The supporting processes that enable the controls’ effectiveness should be viewed from the perspective of consistent, modular growth. Networks, systems, and management tools should be designed to scale or be replaced easily. Consolidated security programs (such as incident response, vulnerability management, and change management) and scheduled requirements (audits, penetration testing, vulnerability assessments, risk assessments, and reports) should be updated to address all required controls across the entire framework, resulting in a consistent, singular approach to compliance and readiness.
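Several of the steps above – mapping equivalent controls across frameworks, defining a consolidated numbering scheme, and flagging the most restrictive requirements – lend themselves to a simple data structure. The sketch below is a minimal illustration; the framework names, control references, and index scheme are examples I chose, not prescribed by any standard.

```python
from dataclasses import dataclass, field

@dataclass
class Control:
    """One control in the consolidated framework."""
    index: str        # consolidated numbering scheme, e.g. "AC-01"
    family: str       # functional area / tier
    description: str
    critical: bool    # addresses a most-restrictive (e.g. regulatory) requirement?
    sources: dict = field(default_factory=dict)  # framework name -> native control reference

# Equivalent controls from different frameworks map into a single entry,
# so one control can be enhanced instead of duplicated per framework.
# The source references below are illustrative examples only.
consolidated = [
    Control("AC-01", "Access Control",
            "Restrict access to sensitive data by business need-to-know",
            critical=True,
            sources={"PCI DSS": "7.1", "HIPAA Security Rule": "164.312(a)(1)"}),
    Control("LG-01", "Logging",
            "Retain audit logs per the longest applicable retention period",
            critical=False,
            sources={"PCI DSS": "10.7"}),
]

def critical_controls(controls):
    """Step: identify controls that address the most restrictive requirements."""
    return [c for c in controls if c.critical]

for c in critical_controls(consolidated):
    mapped = ", ".join(f"{fw} {ref}" for fw, ref in c.sources.items())
    print(f"{c.index} [{c.family}]: {c.description} (maps to: {mapped})")
```

The `sources` mapping is what makes the consolidation auditable: during an assessment against any one framework, the native control references can be recovered from the single consolidated entry.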
The Internal Revenue Service risked disclosing taxpayer information when it failed to identify contractors that had access to financial records and to fix known security weaknesses at facilities where files are stored.
According to an audit released on Tuesday by the Treasury Inspector General for Tax Administration, the IRS did not identify all the vendors that store and process taxpayer data, making it impossible to complete annual security reviews. In addition, at facilities where the IRS did conduct reviews, it failed to check if weaknesses it had identified were corrected.
The Office for Civil Rights (OCR) is responsible for issuing periodic guidance on the provisions in the HIPAA Security Rule. (45 C.F.R. §§ 164.302 – 318.) This series of guidance documents will assist organizations in identifying and implementing the most effective and appropriate administrative, physical, and technical safeguards to protect the confidentiality, integrity, and availability of electronic protected health information. The materials will be updated annually, as appropriate.
View the Draft Guidance on Risk Analysis.
The Health & Human Services Department published draft guidance to help healthcare providers and payers figure out what is expected of them in doing a risk analysis of their protected patient health information.
via In the News.
Today, OWASP has released an updated report capturing the top ten risks associated with the use of web applications in an enterprise. This colorful 22-page report is packed with examples and details that explain these risks to software developers, managers, and anyone interested in the future of web security. Everything at OWASP is free and open to everyone, and you can download the latest OWASP Top 10 report for free at: