Monthly Archives: October 2008

PCI DSS version 1.1 and 1.2 differences and updates

On October 1, 2008 the PCI SSC released version 1.2 of the PCI DSS requirements.  There are a number of changes, as outlined previously in the update document.  The PCI SSC has established a life cycle process that will ensure the PCI DSS standard is revised and updated on a two-year cycle.  What follows is a detailed outline of the differences between version 1.1 and 1.2 (some of which have not been discussed previously) and the implications of those changes. (Unless otherwise noted, the items in quotations are taken directly from the PCI DSS or the update document linked above.)

Requirement 1

  • In version 1.1 : 1.1.5-1.1.7 discussed similar areas.  In version 1.2 : these requirements are combined into 1.1.5 – business justification and documentation of secure service implementation.
  • In version 1.1 : “quarterly review of firewall and router rule sets.”  In version 1.2 : “Requirement to review firewall and router rule sets at least every six months“.  “Now the control can be better customized to the organization’s risk management policies.”

Requirement 2

  • In version 1.1 : Requirement 2.1.1 said to “disable SSID broadcast”.  In version 1.2 : this sub-requirement has been removed.  Although it is still something you should do if you can, it’s no longer a show stopper.

Requirement 3

  • In version 1.1 : Requirement 3.4 said you could use, “Strong one-way hash functions (hashed indexes)”.  In version 1.2 : “One-way hashes based on strong cryptography.”  This may seem like a minor wording change, but the emphasis is on “strong cryptography”, implying the hashing mechanism must resist attack.  Although it does not say so specifically, one might imagine this means salting the hash value (see the sketch after this list).
  • One will also notice the removal of references to SHA-1, Triple-DES, and AES or any specific key length.  The emphasis is on strong cryptography, something you can read about on the NIST website.
  • Requirement 3.4.1 – Disk Encryption – references to “Active Directory” are removed, putting the focus on “local user account databases.”  This reminds us that if we choose to use technologies such as Encrypting File System (EFS), they cannot be reliant on the security of local user accounts.
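For illustration, here is a minimal sketch of what salted, one-way hashing of a PAN might look like in Python. The standard does not prescribe a particular construction; SHA-256 with a random per-record salt is an assumption here, and the `hash_pan` helper is purely hypothetical.

```python
import hashlib
import os

def hash_pan(pan, salt=None):
    """Return (salt, digest) for a salted one-way hash of a card number.

    The random salt defeats precomputed lookup tables, which matters
    because the keyspace of valid card numbers is comparatively small.
    """
    if salt is None:
        salt = os.urandom(16)  # 128-bit random salt, stored alongside the digest
    digest = hashlib.sha256(salt + pan.encode("ascii")).hexdigest()
    return salt, digest

# Store the (salt, digest) pair; verify later by re-hashing with the same salt.
salt, digest = hash_pan("4111111111111111")
assert hash_pan("4111111111111111", salt)[1] == digest
```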

Requirement 4

  • Requirement 4.1.1 – Removes discussion of WEP vs WPA and simply states that cardholder data must be securely encrypted over wireless networks, and to “implement strong encryption for authentication and transmission.”  This is the first reference to ‘authentication’, implying that not only must the data be secure, but authentication to that network must also be protected.
  • New implementations of WEP are not allowed after March 31, 2009
  • Current implementations must discontinue use of WEP after June 30, 2010

Requirement 5

  • At first glance it appears that version 1.2 reverts to an older form of the standard by mandating “anti-virus software applies to all operating system types” but it quickly clarifies the intent still as those systems “commonly affected by malicious software.”  Although the reference to UNIX is removed, it does state that companies should deploy on such systems “if applicable anti-virus technology exists.”

Requirement 6

  • Version 1.2 clarified the intent around Requirement 6.1, which states that security patches must be installed within 30 days of release from the vendor.  “An organization may consider applying a risk-based approach to prioritize their patch installations.”  For example, companies might “ensure high-priority systems and devices are addressed within one month, and addressing less critical devices and systems within three months.”
  • Version 1.2 has updated Requirement 6.5 with the latest OWASP Top 10 list!  Furthermore the PCI DSS future-proofs itself by saying, “However, if and when the OWASP guide is updated, the current version must be used for these requirements.”
  • Version 1.2 makes Requirement 6.6 mandatory and rewrites the requirement to include many of the items outlined in the Information Supplement released previously.  It states, “public-facing web applications, address new threats and vulnerabilities on an ongoing basis and ensure these applications are protected against known attacks.”
  • The big change is in the wording of the first option in Requirement 6.6, which states, “Reviewing public-facing web applications via manual or automated application vulnerability security assessment tools or methods, at least annually and after any changes.”  The mention of code review is removed and the emphasis is now on tools or methods that will identify vulnerabilities within the application.  I’m certain some individuals will appreciate this.

Requirement 7

  • In version 1.2, the sub-requirements are clarified and fleshed out.  Many of the bullet points in the audit procedures of version 1.1 are now their own requirements.  Listing these requirements in the “PCI DSS Requirements” section, rather than only as “Testing Procedures”, is a great move.

Requirement 8

  • Minor edits for clarification.  “Clarified that testing procedures must verify that passwords are unreadable in storage and transmission.”

Requirement 9

  • In version 1.2 there is a note of clarification for Requirement 9.1.1 stating that video cameras should monitor, “any data center, server room or any area that houses systems that store, process, or transmit cardholder data. This excludes the areas where only point-of-sale terminals are present, such as the cashier areas in a retail store.”  This brings us closer to the implication that “wiring closets” may need a video camera.  Where does the server room end and the wiring closet begin?  As always, one should take a risk-based approach when answering this question.
  • The one benefit of the above clarification is that it expands the scope of video monitoring into areas that contain paper files.  Many companies maintain warehouses full of paper files, which under this clarification may require video monitoring as well.  The question will now be: how far does one take this?  Do you need video monitoring in an office environment with only a few papers?  The answer is that, like many things in the standard, one should take a risk-based approach toward answering the question.
  • Version 1.2 “Specified that offsite storage locations must be visited at least annually.”  This is only required when data is actually stored off-site, an item that is suggested but not required.

Requirement 10

  • In version 1.1 : the standard mandated that companies retain audit logs for “a minimum of three months available online.”  In version 1.2 : this is changed to, “Retain audit trail history for at least one year, with a minimum of three months immediately available for analysis.”  The implication is that the data need only be kept readily accessible in the event of a forensic investigation.  The options include: “online, archived, or restorable from back-up.”  A minimal sketch of how such a retention window might be automated follows.
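Here is one illustrative way to enforce the 90-day online / one-year total window in Python, assuming date-stamped log files in a single directory; the paths are hypothetical and a real deployment would hang this off existing log rotation.

```python
import gzip
import os
import shutil
import time
from pathlib import Path

LOG_DIR = Path("/var/log/audit")          # hypothetical online log location
ARCHIVE_DIR = Path("/srv/audit-archive")  # hypothetical archive location
DAY = 86400

def enforce_retention(now=None):
    """Keep ~3 months of logs online, archive up to 1 year, purge the rest."""
    now = time.time() if now is None else now
    for log in LOG_DIR.glob("*.log"):
        st = log.stat()
        if (now - st.st_mtime) / DAY > 90:
            # Older than three months: compress into the archive area,
            # where it remains restorable within the one-year window.
            target = ARCHIVE_DIR / (log.name + ".gz")
            with log.open("rb") as src, gzip.open(target, "wb") as dst:
                shutil.copyfileobj(src, dst)
            os.utime(target, (st.st_atime, st.st_mtime))  # keep original timestamp
            log.unlink()
    for archived in ARCHIVE_DIR.glob("*.log.gz"):
        if (now - archived.stat().st_mtime) / DAY > 365:
            archived.unlink()  # past the one-year retention requirement

enforce_retention()
```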

Requirement 11

  • In version 1.2 : the focus on wireless is expanded to recommend the use of a wireless IDS/IPS, “Test for the presence of wireless access points by using a wireless analyzer at least quarterly or deploying a wireless IDS/IPS to identify all wireless devices in use.”  For more information on our analysis of wireless under PCI DSS v1.1, read the wireless FAQ.
  • In version 1.2 : Requirement 11.1 has more detailed audit procedures outlining criteria for escalating alerts from a wireless IDS/IPS and integrating the alerts into the required incident response plan.
  • The requirement (11.2) for external or Internet vulnerability scanning never explicitly stated that an ASV must be used, even though we all knew it.  Version 1.2 clarifies that point: it “Outlined that ASVs must be used for quarterly external vulnerability scans.”
  • Requirement 11.3 explicitly states in version 1.2 that both “internal and external” penetration testing is required.  This includes the Information Supplement released previously on this topic.

Requirement 12

  • In version 1.2 : the acceptable use policy for employee-facing technologies now includes, “remote access technologies, wireless technologies, removable electronic media, email usage, internet usage, laptops, and Personal Data Assistants (PDAs)”.  I don’t think the list will ever end, so simply make sure employees understand their role in protecting any technology that is part of the cardholder data environment.
  • Requirement 12.6.2 changes the statement that employees must sign saying they have read and accept the requirements.  Version 1.2 states this must be done “at least annually”.
  • Another important change is that version 1.2 merges Requirement 12.10 into 12.8, changing the requirement for a vendor management program from just “service providers and processors” to all organizations, including merchants.  The benefit is that instead of saying “ensure the entity is PCI DSS compliant” the standard (12.8.4) now says, “Maintain a program to monitor service providers’ PCI DSS compliance status.”  This means all companies must now not only have a legal contract with their service providers, but also push them toward validating their compliance status.

Appendix

  • The Compensating Controls section has been partially rewritten.  The criteria for using compensating controls now includes, “Provide a similar level of defense as the original PCI DSS requirement”, which if you ask me is all you really need in the first place.  If you can provide a similar level of defense then you do so by meeting the intent and rigor of the original requirement.
  • Also, the Compensating Controls Worksheet now includes two new sections titled “Validation of Compensating Controls” and “Maintenance”.  This goes right to the heart of the matter: compensating controls must be evaluated and validated, and the maintenance behind them verified, on an ongoing basis.

Attestation of Compliance

  • In version 1.2 of the standard, the appendices include Attestation of Compliance forms for both merchants and service providers.  This simply codifies the items previously requested in this letter.  We now have a template that everyone can use in a uniform manner.

Scoping and Sampling

  • Finally, there is even a flow-chart Appendix on “Determining PCI DSS Scope” and “Determining PCI DSS Samples”.  The scoping chart refers only to how an assessor might reduce the scope of the cardholder data environment (CDE) through sufficient segmentation.  The sampling chart shows how one might go about sampling an environment.

PCI Blog – Compliance Demystified » Blog Archive » PCI DSS version 1.2 differences and updates.

HHS inspector general blasts CMS over lack of HIPAA enforcement

WASHINGTON, D.C. — The inspector general’s office at the Department of Health and Human Services (HHS) has criticized the Centers for Medicare and Medicaid Services (CMS) for its soft enforcement of the Health Insurance Portability and Accountability Act (HIPAA).

In a 19-page letter addressed to acting CMS administrator Kerry Weems, the inspector general reported that audits of hospital security show “numerous, significant vulnerabilities” that put patient data “at high risk.”

According to Modern Healthcare, the report said the CMS has done a good job establishing a mechanism to receive complaints from the public about security issues at healthcare organizations and also has effectively followed up with those organizations to remedy problems mentioned in the complaints. But that method alone was not enough to adequately safeguard patient information.

HHS inspector general blasts CMS over lack of HIPAA enforcement.

FISMA bill could add $150 million to agencies’ costs

An information security bill in the Senate could add $150 million annually to agencies’ current expenses if it became law, a government report issued today estimates.

The Federal Information Security Management Act of 2008 (S. 3474), approved Oct. 1 by the Senate Homeland Security and Governmental Affairs Committee, would require agencies to perform additional audits and evaluations of the government’s information systems.

Based on information from the Office of Management and Budget and other agencies, the Congressional Budget Office estimated the new requirements would add two percent to three percent to current FISMA expenses, according to its report.

Agencies spent nearly $6 billion in fiscal 2007 on requirements related to FISMA, the report states.
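As a rough check of those figures: 2 percent of the nearly $6 billion fiscal 2007 baseline is about $120 million a year and 3 percent is about $180 million, which brackets the $150 million estimate.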

Also, the CBO estimates it would take about four years to meet the legislation’s requirements for the approximately 10,000 federal computer systems currently operating. The CBO estimated that upgrades to meet those new requirements and authorities would increase costs by $40 million of the $150 million in 2009 and about $570 million from 2009 to 2013, according to the report.

The original FISMA law created a comprehensive framework to ensure agencies have secure controls over information supporting federal operations and assets.

In addition to current requirements, the bill would create a chief information security officer council to establish best practices and guidelines for securely maintaining information. The bill also would strengthen the role of each agency’s CISO by giving them additional authorities, would require standardized information security audits and would impose a variety of new reporting requirements. In addition, the Homeland Security Department would be required to test the security of government information systems.

With members of Congress focused on elections, the legislation has little, if any, chance of passage, but some observers have said it raises important issues. 

FISMA bill could add $150 million to agencies’ costs

Hedge Your Bets: The Importance of IT Risk Management in M&A

Information technology (IT) is a critical component in achieving an M&A strategy; without effective IT risk management, the value of the deal could be threatened or even eroded. IT risk management is a multi-disciplinary undertaking, and covers a variety of functional domains—ranging from data protection to change management. (See “Common IT Risk Management Areas” below.) It is also a multi-faceted and complex undertaking that entails consideration of a wide array of compliance requirements. As such, in a business environment with increasing emphasis on regulatory compliance, the role of IT risk management becomes more important as an enabler of the M&A strategy.

Often, many organizations need to demonstrate compliance with several overlapping requirements. A large financial company may need to meet Sarbanes-Oxley (SOX), Gramm-Leach-Bliley Act (GLBA), Payment Card Industry data security standard (PCI), Health Insurance Portability and Accountability Act (HIPAA), and other mandates such as those from the Federal Financial Institutions Examination Council, Office of the Comptroller of the Currency, and Federal Trade Commission; a global transportation company may need to meet SOX, HIPAA, PCI, FTC, and European Union and Asia-Pacific Economic Cooperation data protection requirements. The effort to meet these regulations often further complicates the efforts required to identify an approach and develop a strategy to mitigate risks when consolidating or separating companies.

Although many of these regulations address similar requirements such as data protection, access controls, transaction auditing, data availability and system monitoring, compliance with one set of regulations does not necessarily translate into compliance with another. The specifics of each set of regulations must be carefully evaluated.

Furthermore, international M&A transactions are likely to be much more complex than domestic transactions. In international transactions, companies must not only consider the regulatory compliance concerns noted above; they must also take into account the potential risks to corporate risk governance, employee data rights, customer data expectations, cross-border data flow, as well as the risk and compliance culture of the home countries of all entities involved in the M&A transaction. Failure to adequately address these factors could scuttle the transaction.

In this complex risk environment, it is clear that IT risk management must be implemented effectively to address the myriad legal, regulatory, contract, and compliance requirements; otherwise, IT risk issues left unaddressed could fundamentally affect the overall M&A strategy and desired value creation.

Is the Loss of Business Value Real?
Based on Deloitte’s experience with M&A transactions, when IT risks, especially those risks that are compliance-driven, are not fully addressed, they can completely undermine the expected value creation of an M&A transaction. Generally, IT risk tends to impact M&A deal value in four primary areas: IT cost, EBITDA, technology, and regulatory and governance.

Examples of common IT risk issues that can have a serious negative impact on M&A transactions include:

  • Inevitable technology changes occur with disparate systems in combined entities, often creating system consolidation delays and increasing the security and compliance risks of existing systems
  • The combined entity creates a new state, federal, and/or global jurisdiction operating footprint that often faces potential regulatory and financial risk from the possible compromise of personally identifiable information (PII)
  • The listing of IT assets assumed to be acquired during the financial due diligence process does not reconcile with detailed IT-listed assets, which results in lost value transfer
  • Unclear legal rights over existing key applications and information often inhibit integration and/or separation of IT systems
  • Sensitive information cannot be identified and located, which impedes, and can completely halt, application and system integration and/or isolation
  • The merged entities have disparate access management systems, but they have a need for immediate access to information, which often results in poorly consolidated systems that lead to segregation of duty conflicts and improper data access
  • Hidden liabilities in licenses and third-party contracts result in lost value and increased legal costs
  • Dated technology prevents customization and leads to lost business agility, opportunity and value

So, what is needed to minimize these types of risks from compromising an M&A transaction?

The IT Risk Management Framework
To mitigate the risks described above, M&A due diligence teams should incorporate a comprehensive IT risk management framework and readiness diagnostic into their planning and implementation efforts.

A sound IT risk management framework and readiness diagnostic has several key qualities. First, it is structured, risk-focused, and customizable to cover small and large organizations. Next, it helps translate information protection and technology issues into business risk impacts that will affect the overall M&A transaction. Finally, it helps address industry standards and regulatory requirements for each of the IT risk areas highlighted earlier in this paper.

The IT risk management framework and readiness diagnostic can be organized around five core components — integrated requirements, technology assessment, information assessment, business assessment, and risk quantification.

Integrated requirements establish the required IT risk management practices to be assessed during the M&A transaction. Assessment practices and criteria are established by identifying and aligning the applicable IT risk-related business requirements for each of the common IT risk management areas (see above). These should include:

  • Industry common practices (e.g. International Organization for Standardization (ISO) 27002, COBIT 4.1, Information Technology Infrastructure Library (ITIL), American Institute of Certified Public Accountants’ (AICPA) Generally Accepted Privacy Principles, etc.)
  • Laws and regulations (e.g. GLBA, HIPAA, EU Privacy Directive, CA SB1386, FTC Standards for Safeguarding Customer Information, etc.)
  • Industry standards (e.g. PCI Data Security Standard, BITS, etc.)
  • Acquiring and acquired organizations’ internal IT risk-related policies and standards for each of the common IT risk management areas previously mentioned

This particular IT risk management component is especially beneficial to organizations that worry about compliance questions such as “How does the ‘new’ operating structure comply with SOX quickly?” By establishing and evaluating integrated requirements early in the IT due diligence process, the acquiring organization should have already identified the SOX-related requirements and their impact on the other organization’s operations. Once the M&A transaction has been executed, the acquiring organization should be able to quickly apply its SOX control framework to the acquired organization and assimilate the various reporting entities into the new organization’s compliance testing and reporting process.

A Framework for Value Protection

The technology assessment considers core technology development, licensing and integration issues. Generally, this assessment will consider:

  • Technology software and infrastructure vulnerabilities that may affect service levels
  • Capacity and scalability of key systems to satisfy business requirements
  • System backup and power issues that may cause business disruptions
  • Unsupported systems and code
  • Vendor-owned source code that is not available for changes
  • Vendor service-level adequacy
  • Non-favorable clauses in vendor agreements that would be affected by change in ownership
  • Termination of key employees
  • Loss of quality resources required for integration efforts
  • Legal rights to existing key applications
  • Source code that is not in escrow
  • Hidden liabilities in licenses and support contracts

The information assessment considers sensitive data-handling requirements and how well data is protected. Generally, this assessment will consider:

  • Systems and data accessible by unauthorized users and how unauthorized access to such data can affect the company’s brand and reputation
  • Authorization, development, and approval processes for the records program
  • Privacy, intellectual property, and other sensitive information collection, usage, storage and complaints-handling processes
  • Third party contractual arrangement adequacy for addressing sensitive information handling

The business assessment considers technology strategy alignment with the business, business process control integrity & automation, and governance & compliance matters. Generally, this assessment will consider:

  • IT strategy that is not aligned with the current and future business requirements
  • Current systems that are not suitable for business requirements
  • Inefficient manual work-around procedures that are required to operate the business
  • Level of system automation that does not match the level disclosed by management
  • Recently-integrated business systems that have internal control integrity issues
  • Internal controls and SOX 404 issues that will impact regulatory compliance
  • Insufficient governance of IT system projects that could result in hidden future IT costs or write-down of IT assets due to inappropriate system development

The risk quantification translates identified IT risks into financial impact statements and helps prioritize them for consideration in the final M&A transaction decision.

Today’s risk and compliance environment compels organizations that are developing M&A strategies to integrate IT risk management into their M&A planning and implementation processes. Left unaddressed, IT risk issues can fundamentally affect the overall M&A strategy and desired value creation. A properly structured IT risk management framework and readiness diagnostic can provide practical insights into the information and technology risk issues. Including IT risk management from the outset can make the M&A picture complete, rather than an unfinished puzzle.

Bill Kobel (bkobel@deloitte.com) is a Principal and John Gimpert (jgimpert@deloitte.com) is a Partner with Deloitte & Touche LLP.

Hedge Your Bets: The Importance of IT Risk Management in M&A.

VeriFone Takes Lead in Securing Card Payments with PA-DSS – MarketWatch

VeriFone Holdings, Inc. (NYSE: PAY) today announced an aggressive program to ensure implementation of the PCI Security Standards Council’s (PCI SSC) Payment Application Data Security Standard (PA-DSS). This program establishes a comprehensive PA-DSS compliance policy aimed at ensuring protection of cardholder information across virtually all merchant environments and all types of card acceptance devices.

VeriFone expects rapid availability of its terminal-based payment applications to meet all needs of acquirers and merchants in complying fully with the PA-DSS mandate. PC- and server-based VeriFone applications such as PAYware PC already comply with PA-DSS or its predecessor, the Visa Payment Applications Best Practices (PABP). PA-DSS is intended to ensure secure payment applications do not store prohibited data, such as full magnetic stripe, CVV2, PIN or other sensitive data, and are compliant with the PCI Data Security Standard (PCI DSS).

First published in April 2008, PA-DSS expands upon PABP to encompass card acceptance devices known as “stand-alone POS terminals,” which are commonly used by smaller “level 4” merchants who represent the largest installed base of payment acceptance devices globally. It also encompasses consumer facing payment devices and programmable PIN pads that are connected to electronic cash registers in use at larger “level 1 and 2” merchants.
Merchants are increasingly utilizing these systems in a manner that brings them under PA-DSS requirements, leading VeriFone to establish a universal compliance program for all of the applications used in its programmable payment acceptance devices going forward, initially targeting the US/Canada market. Because each payment application certified by each bank, processor or acquirer must now be audited, full PA-DSS compliance will result in hundreds of individual audits by qualified assessors. Auditing device-based payment applications at the supplier level will minimize the number of audits required and lower compliance costs for buyers.

“Adherence to the PA-DSS by vendors is an excellent way organizations can ensure the utmost in transaction integrity. Providing customers with only PA-DSS audited applications will help us further standardize security levels industry-wide,” said Bob Russo, general manager of the PCI Security Standards Council.

The PCI SSC was founded by American Express, Discover Financial Services, JCB International, MasterCard Worldwide, and Visa Inc. to enhance payment account data security by driving education and awareness of the PCI Security Standards.

“There is nothing more important to this industry than a consumer’s trust in the payment system and VeriFone applauds this bold step by the PCI SSC to create a third-party validation testing program that positively verifies compliance to the PA-DSS standard and ensures protection of sensitive cardholder information,” said VeriFone Chief Security Officer Dave Faoro. “We are taking this bold step to ensure that banks, acquirers and merchants can easily comply.”

According to the PA-DSS mandate, POS terminals that encompass payment applications must be audited by a PA-QSA laboratory unless they are utilized in very limited environments that reduce the possibility of compromise. These restrictions stipulate that the payment device should have no connection to any of the merchant’s systems or networks, that it connects to the acquirer or merchant via a private line, that it can be securely updated remotely, and that sensitive authentication data is not stored. The overwhelming majority of “stand-alone POS terminal” payment applications being certified today by leading processors no longer meet all of these usage restrictions, and therefore fall under the scope of the PA-DSS compliance mandate.

VeriFone Takes Lead in Securing Card Payments with PA-DSS – MarketWatch.

PCI awareness hits all time high – 28 Oct 2008 – CRN

UK firms are more aware than ever before of the need to be compliant with the Payment Card Industry (PCI) standard.

Figures released by customer interaction specialist The Logic Group, which questioned several hundred public sector and commercial organisations in the UK, showed that the number of firms holding back from becoming compliant has dropped by almost half.

This is mainly due to increasingly sophisticated and high-profile cases of card security breaches in the media, resulting in almost total awareness of the standard.

Last year, 11 per cent of organisations were found to be fully compliant, according to The Logic Group’s figures, but in 2008 a total of 15 per cent reported full compliance. In addition, a further 54 per cent are in remediation and 34 per cent expect to be compliant within the next six months.

Robin Adams, director of security risk and compliance at The Logic Group, said: “For the first time since we started to monitor progress through the survey four years ago, we’re seeing the majority of respondents believing that their business will benefit from implementing PCI. We feel this is the year of the final push – where we’ll break the back of the compliance problem. With total UK card fraud losses increasing by 14 per cent in the first half of this year – the impetus has never been greater.”

Earlier this month the advisory board of the Security Standards Council responsible for setting the PCI DSS guidelines met in Brussels to discuss progress in Europe, and developments for the standard’s implementation.

Bob Russo, general manager of the PCI SCC, said: “The Logic Group PCI survey has been a useful resource in providing a barometer of progress towards PCI compliance within the UK for the past four years. The UK is now firmly on the way towards meeting PCI compliance.”

PCI awareness hits all time high – 28 Oct 2008 – CRN

Encryption Tech: 10 Simple Rules for Encrypting Enterprise Data

Enterprises are becoming more and more proactive about data security, with data encryption viewed as a core element of their defensive measures. They are adopting encryption at a rapid rate to comply with industry regulations, protect intellectual property, obtain safe harbor from data breach disclosure laws, and effectively manage risk. As encryption proliferates, IT professionals are making critical decisions that directly contribute to, or detract from, an organization’s ability to effectively manage encryption keys and data security.

Data is an organization’s most valuable asset and it must be protected. Designing and implementing an encryption strategy is not complicated if you understand the needs of your enterprise and establish the right decision-making criteria for encryption solutions.

Simplicity, breadth, manageability and efficiency are the primary requirements security-minded enterprises must build into their encryption strategy. A solution with the fewest complications will make IT professionals’ jobs easier and be more cost- and time-efficient, while protecting data and meeting compliance standards. Here are ten simple rules for evaluating encryption and key management solutions to ensure that the investments you make today deliver strategic value for the future.

1. Encryption shouldn’t have to be painful

Encryption is necessary to secure data at its source. It provides safe harbor from data breach disclosure laws and is mandated by industry standards such as the Payment Card Industry Data Security Standard (PCI DSS) and Health Insurance Portability and Accountability Act (HIPAA). Why then are many enterprises hesitant to adopt encryption? Often, the thought of high implementation costs, changes to applications, complexity and performance degradation prevents enterprises from making smart decisions regarding data security. Encryption technology has evolved immensely in the past few years. New approaches offer cost-efficient manageability combined with stellar performance and application and database transparency. Instead of dealing with negative perceptions about encryption by disregarding the issue, spend some time learning about the new approaches to database, application and file encryption.

2.  Beware of point encryption product explosion

The more encryption products an organization has, the bigger the system management and policy management problem becomes. Avoid ending up with an exploding number of encryption products and all the related key management and policy management headaches that this will bring. Selecting encryption solutions that have the broadest coverage over the largest number of potential systems will eliminate management headaches, as well as homogenize and consolidate data security policy management.

3.  Understand the EKM problem/solution area

The primary purpose of an Enterprise Key Manager (EKM) is to provide a centralized point of key generation, key lifecycle management, key backup and key recovery. When developing an enterprise encryption strategy, it is important to remember that the need for an enterprise key manager grows in line with the number of points for key storage. In addition, enterprise key managers are passive, meaning that they do not actively control the security of the encryption keys as they are handled by the encryption system. Furthermore, a complete encryption solution also includes secure access controls to prevent unauthorized access to sensitive data. This is not something that can be addressed by an EKM alone. A comprehensive encryption strategy requires a hard look first at the methods by which keys are handled by the encryption system and, second, at the overall key management complexities associated with the enterprise encryption program.

4.  Understand the importance of IKM

Integrated Key Management (IKM) is the actual key management structure of an encryption system. This key management process actively controls the methods by which keys are stored and accessed by the encryption system. IKM differs from EKM in that it directly controls the security, storage and handling of keys as they are accessed within the encryption solution. Integrated key management must be a critical part of the evaluation criteria for any encryption solution. If integrated key management is secure and transparent, the management overhead of an encryption solution will be significantly minimized.  It is critical to remember that the need for an EKM will grow directly in line with the number of encryption systems that have been adopted because backup and recovery of encryption keys will become a larger and larger problem as the number of places that keys are held also grows. Selecting solutions that provide integrated key management for the largest number of required encryption end points will go a long way towards eliminating the enterprise key management problem.
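To make the distinction concrete, here is a minimal key-rotation sketch in Python using the `cryptography` package’s Fernet recipe. It illustrates integrated key handling in miniature; it is not a hardened key manager, and the data and key names are illustrative only.

```python
from cryptography.fernet import Fernet, MultiFernet

# IKM in miniature: the encryption layer itself generates, holds,
# and rotates keys rather than delegating everything to an external EKM.
old_key = Fernet.generate_key()
token = Fernet(old_key).encrypt(b"sensitive record")

# Rotation: the new key encrypts going forward, the old key still decrypts.
new_key = Fernet.generate_key()
keyring = MultiFernet([Fernet(new_key), Fernet(old_key)])  # first key is primary
token = keyring.rotate(token)  # re-encrypt existing data under the new key
assert keyring.decrypt(token) == b"sensitive record"

# What remains for an EKM: backing up and recovering old_key and new_key.
# Lose the keys and every token above is unrecoverable.
```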

5.  Learn from the success of SSL

SSL is the most broadly used encryption method in the world today simply because it is transparent to users and applications and is easily managed.  As demonstrated by the success of SSL, the more transparent the encryption solution, the more easily it can be integrated and supported for the long term.  Organizations seeking success in implementing encryption should make transparency an important part of their decision-making criteria.  Without transparency, encryption solutions can take as long as twelve months to install and can impose significant costs during application change processes.  With transparency, encryption can be implemented within days, and never needs to be considered an inhibitor as the organization seeks to optimize its information management programs.

6.  Look beyond the column

While column-level encryption can intuitively seem like the most practical method to encrypt database data, its invasiveness and lack of scalability make it inefficient, offer limited protection and, sometimes, render it unusable. Column-level encryption is not transparent to databases or applications. This lack of transparency can drastically complicate application change management requirements, require a significant amount of customization of both the database and the application, and place the performance burden directly on the database itself. Furthermore, as projects to protect a single column, such as credit card numbers, evolve into broader data protection requirements for personally identifiable information (PII), the number of columns that need to be encrypted explodes, drastically hurting database performance and raising implementation costs. Most importantly, databases send sensitive information to a myriad of locations, including database log files, application log files, document outputs to servers, and backups.  Column-level encryption offers no protection for unstructured data. Before leaping to undertake a potentially extensive and long column-level encryption project, organizations should fully educate themselves on the costs and benefits of every approach to database encryption, including database file-level and column-level encryption.
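A short sketch shows the invasiveness point in practice: with column-level encryption, every write and read path in the application must change to encrypt and decrypt explicitly, and plaintext queries against the column stop working. Python with SQLite and the `cryptography` package stand in here for whatever cipher a product actually provides.

```python
import sqlite3
from cryptography.fernet import Fernet

cipher = Fernet(Fernet.generate_key())

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, pan BLOB)")

# Every INSERT must be rewritten to encrypt the column explicitly...
db.execute("INSERT INTO orders (pan) VALUES (?)",
           (cipher.encrypt(b"4111111111111111"),))

# ...and every SELECT must decrypt it in application code. A WHERE clause
# on the plaintext value is now impossible: the database only ever sees
# ciphertext, so indexing and searching on the column break.
row = db.execute("SELECT pan FROM orders WHERE id = 1").fetchone()
print(cipher.decrypt(row[0]))  # b'4111111111111111'
```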

7.  Prepare for virtualization

Virtualization changes the overall security model. Because the operating system is not tied to a physical disk, it can be moved from system to system. Full disk encryption and physical security, which have been broadly implemented to protect operating environments and the data housed within them, lose their effectiveness in virtualized environments. Instead of stealing the physical disk, entire operating environments can be logically accessed and easily transferred.  Organizations that plan on implementing virtualization should re-evaluate their data and system protection mechanisms in light of the new security risks.  Implementing data encryption that travels with the operating system environment in conjunction with or instead of full disk encryption will go a long way as the use of virtualization exponentially increases throughout enterprise infrastructure.

8.  Policy is key

Encryption is easy – getting decryption right is hard! By combining encryption with an access control based decryption policy, the value of encryption grows from a simple scrambling of bits as a physical theft deterrent to a dynamic data security solution that places controls directly on the data itself. To gain strong security benefit from their encryption projects, organizations should look for solutions that not only scramble bits, but apply security policies on the data itself.
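A minimal sketch of the idea, again in Python with Fernet standing in for whatever cipher a product actually uses: decryption is gated by an access policy, so the control travels with the data rather than with the disk it happens to sit on. The roles and policy table are hypothetical.

```python
from cryptography.fernet import Fernet

cipher = Fernet(Fernet.generate_key())

# Hypothetical policy: which roles may decrypt which classes of data.
POLICY = {
    "payments-batch": {"cardholder-data"},
    "marketing-app": set(),  # may hold tokens but never see plaintext
}

def policy_decrypt(token, role, data_class):
    """Decrypt only if the access policy permits this role to see this data."""
    if data_class not in POLICY.get(role, set()):
        raise PermissionError("%s may not decrypt %s" % (role, data_class))
    return cipher.decrypt(token)

token = cipher.encrypt(b"4111111111111111")
policy_decrypt(token, "payments-batch", "cardholder-data")    # allowed
# policy_decrypt(token, "marketing-app", "cardholder-data")   # PermissionError
```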

9.  Consider all applications and operating systems

Many encryption solutions are tied to specific versions of applications and operating systems. For example, enterprises typically have numerous versions of the same database across various parts of the enterprise. They also have numerous databases running on a wide array of different operating systems. While it seems natural to implement encryption solutions that come as part of the application, this leads to an explosion in the number of encryption solutions. If encryption is only available for a specific version of a database, for example, but enterprises are unable to update all of their databases to the most up-to-date version, it leaves them with a hole in their overall security solution. Furthermore, training costs increase with a wide array of point solutions that are tied to the application or the operating system. Solutions exist today that, due to the transparent nature of their operation, can cover all applications across multiple operating systems. This allows the enterprise to deploy a single solution, reduce their key management issues and minimize both implementation and administration costs.

10.  Think of encryption as an enabler

Encryption can help your business. Data security is a proactive way to comply with government and industry regulations and ensure customer confidence. Regulations like the Gramm-Leach-Bliley Act (GLBA) of 1999, California SB 1386, California AB 1950, HIPAA, and PCI DSS require enterprises to protect sensitive information, with penalties for noncompliance such as hefty fines and litigation. In addition, in the event of a data breach, an enterprise will suffer damage to its reputation and the loss of customer confidence. By using encryption, an enterprise demonstrates its proactive dedication to data protection.

Encryption should no longer be feared! Enterprises can find effective, cost-efficient solutions and strategies that allow their business to gain the benefits of a broad data security program without changing their applications or requiring their administrators to learn multiple different solutions. Thanks to advancements in encryption approaches, solutions now exist that secure data without creating management complexities or performance nightmares.

Steve Pate is the chief technology officer for Vormetric.

Encryption Tech: 10 Simple Rules for Encrypting Enterprise Data | Computer Technology Review: Data Storage and Network Solutions.

Thales joins PCI Security Standards Council

Thales, a leader in information systems and communications security, announced today that it has joined the PCI Security Standards Council as a new participating organisation.

As a Participating Organisation, Thales will work with the Council to evolve the PCI Data Security Standard (DSS) and other payment card data protection standards.

The PCI DSS, endorsed by American Express, Discover Financial Services, JCB International, MasterCard Worldwide and Visa Inc., requires merchants and service providers that store, process or transmit customer payment card data to adhere to information security controls and processes that ensure data integrity. More information on the council and the standard can be found at www.pcisecuritystandards.org

As a Participating Organisation, Thales will now have access to the latest payment card security standards from the Council, be able to provide feedback on the standards and become part of a growing community that now includes more than 500 organisations. In an era of increasingly sophisticated attacks on systems, adhering to the PCI DSS represents an entity’s best protection against data criminals. By joining as a Participating Organisation, Thales is adding its voice to the process.

“The PCI Security Standards Council is committed to helping everyone involved in the payment chain protect consumer payment data,” said Bob Russo, General Manager of the PCI Security Standards Council. “By participating in the standards setting process, Thales demonstrates it is playing an active part in this important end goal.”

Paul Meadowcroft, Head of Transaction Security for the Information Systems Security activities of Thales, said: “As a leader in the provision of security solutions to the banking and finance sector, we endeavour to be at the centre of the continuous improvement of payment security standards and processes. We believe that the PCI DSS is a strong defence in the fight against data theft. By joining the PCI Security Standards Council, we are committed to advancing the understanding and adoption of PCI DSS and we look forward to collaborating with other Council members for the benefit of the wider industry in the future.”

Finextra: Thales joins PCI Security Standards Council

Data Breaches at State, Local Agencies Expose Data about Millions

October 21, 2008 • by William Jackson

Data breaches at state and local government agencies exposed the personal information of nearly 3.8 million Americans in the first three quarters of this year, according to the Privacy Rights Clearinghouse.

Most of the exposures came from a single incident in July at the Colorado Division of Motor Vehicles that compromised information on 3.4 million people. But even discounting that incident, the number of records exposed in breaches at state and local agencies outstripped those reported at federal agencies in the same period.

The figures underscore the need for standardized and improved data security at state and local government agencies, said Abe Kleinfeld, president and chief executive officer of nCircle Network Security Inc., of San Francisco.

“I don’t think we are seeing an unusual amount of data breaches” at the state and local levels, Kleinfeld said. “The danger is the kind of data they have. It is becoming increasingly important that states begin developing some kind of program.”

The company compiled the data on state and local breaches from the Privacy Rights Clearinghouse, which documented 20 breaches through September.

In the same period, the clearinghouse reported five incidents of breaches at federal agencies that exposed the records of 23,024 people. The largest was in May at the Marine Corps Reserve Center in San Antonio, where a contractor improperly accessed and stole 17,000 records. Another incident at the International Visa Service in Atlanta involved an employee’s theft of data on 1,000 people.

The lower federal numbers illustrate improvements in the government’s data security, which Kleinfeld said can be attributed largely to the standardized processes and controls required under the Federal Information Security Management Act.

“There are a lot of complaints about FISMA, but I think it is hard to argue that security has not improved in the federal government,” he said. “It has improved.”

States remain vulnerable because there is no similar overarching standard for data or information system security, he said. “We need some kind of program like FISMA that extends across state and local governments,” he added.

Imposing a nationwide standard for government data security would be difficult, and it is unlikely to happen in the short run, Kleinfeld said. But FISMA-like requirements could eventually be extended to state and local agencies that administer federal programs or share data with federal agencies.

In the meantime, Sen. Norm Coleman (R-Minn.) introduced a bill last month that could help. S. 3460, the State Cyber Security Protection Act of 2008, would give the Homeland Security Department $25 million to fund a pilot program to support cybersecurity efforts at the state level. The bill has been referred to the Homeland Security and Governmental Affairs Committee.

A program to share best practices among agencies at all levels of government and create cybersecurity templates, even if they are not mandated, would be a big step forward in data security, Kleinfeld said.

Security breaches with exposure of personal data at the state level, as reported by the Privacy Rights Clearinghouse through September, include:

  • Florida Department of Children and Families — 1,200 records exposed.
  • Maryland Department of Assessments and Taxation — 900 records exposed.
  • Wisconsin Department of Health and Family Services — 260,000 records exposed.
  • Virginia Department of Social Services — 1,500 records exposed.
  • Wisconsin Department of Revenue — 5,000 records exposed.
  • South Carolina Department of Health and Environmental Control — 400 records exposed.
  • Nevada Department of Public Safety — 109 records exposed.
  • Utah Division of Finance — 500 records exposed.
  • Pennsylvania Department of State — 30,000 records exposed.
  • Rhode Island Department of Administration — 1,400 records exposed.
  • Oklahoma Department of Corrections — 10,597 records exposed.
  • Baltimore Highway Administration — 1,800 records exposed.
  • Oklahoma Corporation Commission — 5,000 records exposed.
  • Connecticut Department of Labor — 2,100 records exposed.
  • California Department of Consumer Affairs — 5,000 records exposed.
  • Texas Department of Public Safety — 826 records exposed.
  • Florida Agency for Health Care Administration — 55,000 records exposed.
  • Colorado Division of Motor Vehicles — 3.4 million records exposed.
  • California Department of Consumer Affairs — 5,000 records exposed.
  • Pennsylvania Department of Public Welfare — 2,845 records exposed.

William Jackson is the senior writer for Government Computer News (GCN.com). You can contact William about “Data Breaches at State, Local Agencies Expose Data about Millions” at bjackson@1105govinfo.com.

Redmond | News: Data Breaches at State, Local Agencies Expose Data about Millions

National Survey Finds Most Companies Expect to Be Compliant with PCI Standards within 18 Months

Findings indicate authentication and access among top priorities; 44 percent have deployed two-factor authentication; 26 percent aim to go beyond compliance to deploy best practices and technologies

LEXINGTON, Mass.–(BUSINESS WIRE)–Imprivata®, Inc., the converged authentication and access management company, announced the results of a national survey examining Identity Management Trends in PCI Compliance 2008, covering the state of Payment Card Industry (PCI) Data Security Standard (DSS) compliance across a cross-section of industries. Timed with the release of PCI DSS 1.2 on Oct. 1, 2008, this online survey of IT decision makers covered companies of all sizes and highlighted trends and the role of authentication and access technologies in achieving compliance.

Survey Facts

The time is now for most companies to select, buy and deploy technologies to achieve compliance within 18 months:

Companies across a variety of industries must comply with the PCI DSS requirements or risk steep penalties and fines – most deem compliance very important to avoiding unnecessary risk and related costs. Many firms are actively engaged in the PCI DSS compliance process by examining the specific requirements, retaining a consultant and/or implementing technologies to satisfy the industry mandates.

  • Despite the latest PCI DSS compliance requirements deadline having passed in June 2008, only 39 percent of respondents confirmed they are currently compliant.
  • Of the 61 percent of respondents that are not yet compliant, 53 percent expect to become compliant within 12 months; 65 percent expect to be compliant within 18 months.
  • 90 percent of those respondents not yet compliant view PCI DSS compliance as important; 44 percent consider it very or extremely important.

Authentication and access technologies are clear priorities to achieving PCI DSS compliance:

The PCI DSS regulations cover twelve specific areas across IT disciplines, with many tied to authentication and access technologies that are the current focus of investments for respondents’ compliance efforts. Many respondents have outlined specific authentication and access technologies as areas they still need to invest in to satisfy compliance requirements and to achieve key security objectives overall.

  • To control individual access to computing resources and cardholder information, 74 percent have assigned a unique user ID, 63 percent have deployed strong authentication technologies and 63 percent have deployed password management technologies.
  • 35 percent of respondents have already deployed single sign-on (SSO), and 39 percent have deployed physical access security cards.
  • In pursuit of PCI DSS compliance to satisfy the 12 specific regulations: 68 percent of respondents have already restricted access to cardholder data based on need-to-know; 73 percent have assigned a unique ID to each person with computer access; 75 percent restrict physical access to cardholder data; 70 percent track and monitor all access to network resources and cardholder data.

Companies are moving beyond simple ‘check-box’ compliance to deploy best-of-breed security technologies and establish best practices:

As companies work towards meeting the PCI DSS mandates, there is a group of respondents that are concerned with more than simple compliance. Instead, while interested in compliance, their primary driver is to improve their security in a holistic manner.

  • 26 percent of those not yet compliant aim to have the best security available in the industry to protect data.
  • 31 percent acknowledge the risk of significant penalties is their primary driver for achieving PCI DSS compliance.

The study was conducted in June and July 2008, culminating in 64 responses from IT decision-makers across the U.S. spanning every major industry.

National Survey Finds Most Companies Expect to Be Compliant with PCI Standards within 18 Months