Google has announced that its Google Apps for Business has earned the international security standard ISO 27001 certification following a nine-month auditing process.
The cost of achieving regulatory security compliance averages $3.5 million each year, according to a survey of 160 individuals leading the IT, privacy, and audit efforts at 46 multinational organizations.
# The organization must understand which frameworks or framework elements are needed to address, at a minimum, the critical security concerns. When addressing control requirements, more is not necessarily better, and each additional control entity represents an investment in time, money, and effort.
# Choose a base framework to use. An organization should identify a base framework to contain the additional controls. This framework should be as broad as is viable, allowing for only minimal, more specific needs to be addressed.
# Break the identified framework elements down according to functional areas and combine controls into like families or tiers. Different frameworks often contain equivalent controls under different headings or focus areas. By understanding where the controls map to one another, existing controls can often simply be enhanced rather than having to add completely different compliance needs.
# Identify critical controls that address the most restrictive requirements. In many situations, there will be control objectives that must be accomplished, intermingled with additional categories that are simply “good-to-have”. The action items that are required for compliance needs should be categorized as more critical.
# Define control “numbering system” and nomenclature. For ease of evaluation and tracking, the combined framework elements should be indexed in a way that allows them to be viewed as parts of a whole. In addition, a formalized control language should be used to address concepts across the new framework, avoiding confusion as compliance efforts begin.
# Identify affected data. Just as it was necessary in the first step to identify which controls and frameworks were needed, it is now necessary to reverse the process, ensuring that all elements of data subject to the collected controls are identified. The majority of this information was known at the start of the exercise, but a second glance after consolidating the requirements often identifies additional data sources, repositories, and systems.
# Understand data flows. As critical as it is to understand the affected data elements, it is just as important to understand where those data elements reside and why. How the information is collected, processed, stored, and transmitted is essential to determining in-scope systems, applications, and processes that must adhere to the new framework.
# Formally define scope of data controlled by the frameworks. After identifying the data flow patterns and practices, a consolidated list of servers, systems, applications, processes, and governance items must be created and then reviewed against expected values.
# Reduce data scope aggressively. Each data control element is an investment in time, money, and effort. The same can be said for each element of the in-scope data that is addressed by the combined framework. Existing business processes and needs should be used to determine if data is being used or retained in inappropriate or unneeded areas. Where possible, data should be consolidated and purged, reducing the overall scope of control coverage, especially critical control requirements such as those brought on by legal or regulatory provisions. (Editor’s note: see Ben Rothke and David Mundhenk’s guidance on reducing PCI scope.)
# Classify affected data according to impact. Some controls will be identified as more critical, and the data elements associated with these will likewise be viewed as more sensitive. These classes of information assets should be classified and labeled to ensure that adequate attention is applied.
# Define data lifecycle elements based upon classification levels and requirements identified by various standards and practices. Once the combined framework controls are in place; the data is identified, scoped, and minimized; and classification levels have been established, a comprehensive data lifecycle program should be implemented. Through this process, end users can manage data elements, complying with the chosen control framework requirements without having to conduct extensive research into sometimes arcane control sets.
# Review existing infrastructure, policy, and procedure against the consolidated framework and data lifecycle requirements. Governance and operational resources must be reviewed against the newly developed framework and associated lifecycle elements. Where needed, changes should be made to support the new controls system.
# Implement consistent solutions across all data elements located within the tier. The supporting processes that enable the controls' effectiveness should be viewed from the perspective of consistent, modular growth. Networks, systems, and management tools should be designed to scale or be replaced easily. Consolidated security programs (such as incident response, vulnerability management, and change management) and scheduled requirements (audits, penetration testing, vulnerability assessments, risk assessments, and reports) should be updated to address all required controls across the entire framework, resulting in a consistent, singular approach to compliance and readiness.
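The first few steps above (select controls, group them by functional area, identify critical requirements, and assign a unified numbering system) can be sketched in code. This is a minimal illustration only: the framework names, control IDs, functional areas, and the `CF-` index scheme are invented for the example, not drawn from any actual standard.

```python
# Sketch: consolidate controls from several frameworks into one indexed
# base framework. All framework names, control IDs, and index labels
# below are illustrative placeholders.

from collections import defaultdict

# Source controls, tagged with the functional area they address and
# whether they carry a mandatory (critical) compliance requirement.
source_controls = [
    {"framework": "PCI DSS",   "id": "8.2",        "area": "access_control", "critical": True},
    {"framework": "HIPAA",     "id": "164.312(d)", "area": "access_control", "critical": True},
    {"framework": "ISO 27001", "id": "A.9.4",      "area": "access_control", "critical": False},
    {"framework": "ISO 27001", "id": "A.12.3",     "area": "backup",         "critical": False},
]

def consolidate(controls):
    """Group equivalent controls by functional area and assign each group
    a single index in the combined framework (e.g. CF-ACCESS_CONTROL-01)."""
    by_area = defaultdict(list)
    for c in controls:
        by_area[c["area"]].append(c)
    combined = []
    for n, (area, group) in enumerate(sorted(by_area.items()), start=1):
        combined.append({
            "index": f"CF-{area.upper()}-{n:02d}",
            "maps_to": [f'{c["framework"]} {c["id"]}' for c in group],
            # The combined control is critical if any source control is.
            "critical": any(c["critical"] for c in group),
        })
    return combined

for control in consolidate(source_controls):
    print(control["index"], control["maps_to"], "critical:", control["critical"])
```

The mapping step is what lets an existing control be enhanced rather than duplicated: three access-control requirements from three frameworks collapse into one indexed entry, with the most restrictive (critical) requirement setting the bar for the group.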
… When we asked the 379 respondents to our InformationWeek Analytics survey on regulatory compliance how many requirement sets their organizations are addressing, the No. 1 answer was four or more, at 35%.
By now, many of you have read the newly released ISO 31000 Risk management — Principles and guidelines standard. (Others may have seen its release draft or be familiar with its predecessor, the AS/NZS 4360 standard.)
It provides a well-written, step-by-step guide to risk management processes that can be applied to whole organizations, or any part thereof. So far, it has received well-deserved praise for its surprising brevity and consolidated value. These are especially important characteristics for a document with as lofty a goal as standardizing what it calls “an integral part of all organizational processes.”
Enterprises are becoming more and more proactive about data security, with data encryption viewed as a core element to their defensive measures. They are adopting encryption at a rapid rate to comply with industry regulations, protect intellectual property, obtain safe harbor from data breach disclosure laws, and effectively manage risk. As encryption proliferates, IT professionals are making critical decisions that directly contribute to, or detract from, an organization’s ability to effectively manage encryption keys and data security.
Data is an organization’s most valuable asset and it must be protected. Designing and implementing an encryption strategy is not complicated if you understand the needs of your enterprise and establish the right decision-making criteria for encryption solutions.
Simplicity, breadth, manageability and efficiency are the primary requirements security-minded enterprises must build into their encryption strategy. A solution with the fewest complications will make the jobs of IT professionals easier and be more cost- and time-efficient, while at the same time protecting data and meeting compliance standards. Here are ten simple rules for evaluating encryption and key management solutions to ensure that the investments you make today deliver strategic value for the future.
1. Encryption shouldn’t have to be painful
Encryption is necessary to secure data at its source. It provides safe harbor from data breach disclosure laws and is mandated by industry standards and regulations such as the Payment Card Industry Data Security Standard (PCI DSS) and the Health Insurance Portability and Accountability Act (HIPAA). Why then are many enterprises hesitant to adopt encryption? Often, the thought of high implementation costs, changes to applications, complexity and performance degradation prevents enterprises from making smart decisions regarding data security. Encryption technology has evolved immensely in the past few years. New approaches offer cost-efficient manageability combined with stellar performance and application and database transparency. Instead of dealing with negative perceptions about encryption by disregarding the issue, spend some time learning about the new approaches to database, application and file encryption.
2. Beware of point encryption product explosion
The more encryption products an organization has, the bigger the system management and policy management problem becomes. Avoid ending up with an exploding number of encryption products and all the related key management and policy management headaches that this will bring. Selecting encryption solutions that have the broadest coverage over the largest number of potential systems will eliminate management headaches, as well as homogenize and consolidate data security policy management.
3. Understand the EKM problem/solution area
The primary purpose of an Enterprise Key Manager (EKM) is to provide a centralized point of key generation, key lifecycle management, key backup and key recovery. When developing an enterprise encryption strategy, it is important to remember that the need for an enterprise key manager grows in line with the number of points for key storage. In addition, enterprise key managers are passive, meaning that they do not actively control the security of the encryption keys as they are handled by the encryption system. Furthermore, a complete encryption solution also includes secure access controls to prevent unauthorized access to sensitive data. This is not something that can be addressed by an EKM alone. A comprehensive encryption strategy requires a hard look first at the methods by which keys are handled by the encryption system and, second, at the overall key management complexities associated with the enterprise encryption program.
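The lifecycle duties an EKM centralizes (generation, rotation, backup, recovery) can be shown with a toy in-memory manager. This is only a conceptual sketch: the class and method names are invented, the "escrow" is a plain dictionary, and a real enterprise key manager would rely on an HSM or a KMIP server rather than process memory.

```python
# Toy sketch of the key lifecycle an enterprise key manager centralizes.
# An in-memory dict stands in for secure key storage; real deployments
# would use an HSM or KMIP-based key server.

import secrets

class KeyManager:
    def __init__(self):
        self._keys = {}      # key_id -> (current_version, key_bytes)
        self._escrow = {}    # (key_id, version) -> backup copy for recovery

    def generate(self, key_id):
        key = secrets.token_bytes(32)        # 256 bits of key material
        self._keys[key_id] = (1, key)
        self._escrow[(key_id, 1)] = key      # back up at creation
        return key

    def rotate(self, key_id):
        version, _ = self._keys[key_id]
        key = secrets.token_bytes(32)
        self._keys[key_id] = (version + 1, key)
        self._escrow[(key_id, version + 1)] = key
        return key

    def recover(self, key_id, version):
        # Recovery of old versions keeps previously encrypted data readable.
        return self._escrow[(key_id, version)]

km = KeyManager()
old = km.generate("db-payroll")
new = km.rotate("db-payroll")
assert km.recover("db-payroll", 1) == old    # old ciphertext stays recoverable
assert km.recover("db-payroll", 2) == new
```

The sketch also makes the article's scaling point concrete: every additional encryption end point adds key IDs and versions to track, which is why the need for centralized key management grows with the number of key storage points.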
4. Understand the importance of IKM
Integrated Key Management (IKM) is the actual key management structure of an encryption system. This key management process actively controls the methods by which keys are stored and accessed by the encryption system. IKM differs from EKM in that it directly controls the security, storage and handling of keys as they are accessed within the encryption solution. Integrated key management must be a critical part of the evaluation criteria for any encryption solution. If integrated key management is secure and transparent, the management overhead of an encryption solution will be significantly minimized. It is critical to remember that the need for an EKM will grow directly in line with the number of encryption systems that have been adopted because backup and recovery of encryption keys will become a larger and larger problem as the number of places that keys are held also grows. Selecting solutions that provide integrated key management for the largest number of required encryption end points will go a long way towards eliminating the enterprise key management problem.
5. Learn from the success of SSL
SSL is the most broadly used encryption method in the world today simply because it is transparent to users and applications and is easily managed. As demonstrated by the success of SSL, the more transparent the encryption solution, the more easily it can be integrated and supported for the long term. Organizations seeking success in implementing encryption should make transparency an important part of their decision-making criteria. Without transparency, encryption solutions can take as long as twelve months to install and impose significant costs during application change processes. With transparency, encryption can be implemented within days and never needs to be considered an inhibitor as the organization seeks to optimize its information management programs.
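The transparency argument is easy to see in Python's standard `ssl` module: the application keeps the ordinary socket API, and encryption is added by wrapping the socket in a security context. The hostname in the helper below is a parameter the caller supplies; nothing here is specific to any product.

```python
# Transparency in practice: the TLS-enabled code uses the same socket
# calls as plaintext code, with one wrap_socket() step added.

import socket
import ssl

# A context with sane defaults: certificate validation and hostname checks.
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname

def negotiated_version(host, port=443):
    """Connect over TLS and report the negotiated protocol version.
    Apart from the wrap_socket() call, this is identical to plaintext
    socket code."""
    with socket.create_connection((host, port)) as raw:
        with ctx.wrap_socket(raw, server_hostname=host) as tls:
            return tls.version()             # e.g. 'TLSv1.3'
```

Because the encryption layer sits below the application's read/write calls, existing code needs no changes to its data handling, which is exactly the property the article urges buyers to demand from other encryption layers.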
6. Look beyond the column
While column-level encryption can intuitively seem like the most practical method to encrypt database data, its invasiveness and lack of scalability make it inefficient, limit its protection and can sometimes render it unusable. Column-level encryption is not transparent to databases or applications. This lack of transparency can drastically complicate application change management requirements, require significant customization of both the database and the application, and place the performance burden directly on the database itself. Furthermore, as projects to protect a single column, such as credit card numbers, evolve into broader data protection requirements for personally identifiable information (PII), the number of columns that need to be encrypted explodes, drastically hurting database performance and raising implementation costs. Most importantly, databases send sensitive information to a myriad of locations, including database log files, application log files, document outputs to servers, and backups; column-level encryption offers no protection for this unstructured data. Before leaping into a potentially extensive and lengthy column-level encryption project, organizations should fully educate themselves on the costs and benefits of every approach to database encryption, including both database file-level and column-level encryption.
7. Prepare for virtualization
Virtualization changes the overall security model. Because the operating system is not tied to a physical disk, it can be moved from system to system. Full disk encryption and physical security, which have been broadly implemented to protect operating environments and the data housed within them, lose their effectiveness in virtualized environments. Instead of stealing the physical disk, entire operating environments can be logically accessed and easily transferred. Organizations that plan on implementing virtualization should re-evaluate their data and system protection mechanisms in light of the new security risks. Implementing data encryption that travels with the operating system environment in conjunction with or instead of full disk encryption will go a long way as the use of virtualization exponentially increases throughout enterprise infrastructure.
8. Policy is key
Encryption is easy; getting decryption right is hard. By combining encryption with an access-control-based decryption policy, the value of encryption grows from a simple scrambling of bits as a physical theft deterrent to a dynamic data security solution that places controls directly on the data itself. To gain a strong security benefit from their encryption projects, organizations should look for solutions that not only scramble bits, but apply security policies to the data itself.
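Policy-gated decryption can be illustrated with a short toy example. Everything here is invented for illustration: the policy shape, the role names, and especially the "cipher", which is a base64 placeholder standing in for real authenticated encryption (e.g. AES-GCM). The point is only the control flow: the policy check runs before any plaintext is released.

```python
# Toy sketch of rule 8: decryption succeeds only when the caller's role
# satisfies the access-control policy attached to the resource. The
# base64 transform is a placeholder, NOT real encryption.

import base64

# Illustrative policy: which roles may decrypt which resource.
policy = {"payroll.db": {"allowed_roles": {"hr_admin"}}}

def encrypt(resource, plaintext):
    # Placeholder transform only; a real system would use authenticated
    # encryption keyed per resource.
    return base64.b64encode(plaintext.encode())

def decrypt(resource, ciphertext, role):
    """Release plaintext only if the role passes the resource policy."""
    if role not in policy.get(resource, {}).get("allowed_roles", set()):
        raise PermissionError(f"{role} may not decrypt {resource}")
    return base64.b64decode(ciphertext).decode()

token = encrypt("payroll.db", "salary data")
print(decrypt("payroll.db", token, role="hr_admin"))
# decrypt("payroll.db", token, role="dba") would raise PermissionError
```

Binding the check to the decrypt path, rather than to the storage layer, is what makes the control travel with the data: copying the ciphertext elsewhere does not bypass the policy.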
9. Consider all applications and operating systems
Many encryption solutions are tied to specific versions of applications and operating systems. For example, enterprises typically have numerous versions of the same database across various parts of the enterprise. They also have numerous databases running on a wide array of different operating systems. While it seems natural to implement encryption solutions that come as part of the application, this leads to an explosion in the number of encryption solutions. If encryption is only available for a specific version of a database, for example, but enterprises are unable to update all of their databases to that version, they are left with a hole in their overall security solution. Furthermore, training costs increase with a wide array of point solutions tied to the application or the operating system. Solutions exist today that, due to the transparent nature of their operation, can cover all applications across multiple operating systems. This allows the enterprise to deploy a single solution, reduce its key management issues and minimize both implementation and administration costs.
10. Think of encryption as an enabler
Encryption can help your business. Data security is a proactive way to comply with government and industry regulations and ensure customer confidence. Regulations like the Gramm-Leach-Bliley Act (GLBA) of 1999, California SB 1386, California AB 1950, HIPAA, and PCI DSS require enterprises to protect sensitive information, with penalties for noncompliance such as hefty fines and litigation. In addition, in the event of a data breach, an enterprise will suffer damage to its reputation and the loss of customer confidence. By using encryption, an enterprise demonstrates its proactive dedication to data protection.
Encryption should no longer be feared! Enterprises can find effective, cost-efficient solutions and strategies that allow their business to gain the benefits of a broad data security program without changing their applications or requiring their administrators to learn multiple different solutions. Thanks to advancements in encryption approaches, solutions now exist that secure data without creating management complexities or performance nightmares.
Steve Pate is the chief technology officer for Vormetric.
Welcome to this website.
The intent of this project is to bring together a collection of information about software that enables private and government organizations to comply with existing and new regulations.