The cost of achieving regulatory security compliance averages $3.5 million per year, according to a survey of 160 individuals leading the IT, privacy, and audit efforts at 46 multinational organizations.
On the heels of Forrester’s GRC Market Overview last month, this week we published my Governance, Risk, And Compliance Predictions: 2011 And Beyond report. Based on our research with GRC vendors, buyers, and users, this paper highlights the aggressive regulatory environment and greater attention to risk management as drivers for change.
One out of every two IT security professionals spends 50% of the work week on regulatory compliance initiatives, according to a new survey.
# The organization must understand which frameworks or framework elements are needed to address, at a minimum, its critical security concerns. When addressing control requirements, more is not necessarily better: each additional control represents an investment in time, money, and effort.
# Choose a base framework. An organization should identify a base framework to contain the additional controls. This framework should be as broad as is viable, so that only a few more specific needs have to be addressed separately.
# Break the identified framework elements down by functional area and combine controls into like families or tiers. Different frameworks often contain equivalent controls under different headings or focus areas. By understanding how the controls map to one another, existing controls can often simply be enhanced rather than adding entirely separate controls for each compliance need.
# Identify critical controls that address the most restrictive requirements. In many situations, control objectives that must be accomplished will be intermingled with categories that are simply good to have. Action items required for compliance should be categorized as more critical.
# Define a control numbering system and nomenclature. For ease of evaluation and tracking, the combined framework elements should be indexed so that they can be viewed as parts of a whole. In addition, a formalized control language should be used across the new framework, avoiding confusion as compliance efforts begin.
# Identify affected data. Just as it was necessary in the first step to identify which controls and frameworks were needed, it now becomes necessary to reverse the process, ensuring that all data elements subject to the collected controls are accounted for. The majority of this information was known at the start of the exercise, but a second look after consolidating the requirements often identifies additional data sources, repositories, and systems.
# Understand data flows. As critical as it is to understand the affected data elements, it is just as important to understand where those data elements reside and why. How the information is collected, processed, stored, and transmitted is essential to determining in-scope systems, applications, and processes that must adhere to the new framework.
# Formally define the scope of data controlled by the frameworks. After identifying the data flow patterns and practices, a consolidated list of servers, systems, applications, processes, and governance items must be created and then reviewed against expected values.
# Reduce data scope aggressively. Each data control element is an investment in time, money, and effort. The same can be said for each element of the in-scope data that is addressed by the combined framework. Existing business processes and needs should be used to determine if data is being used or retained in inappropriate or unneeded areas. Where possible, data should be consolidated and purged, reducing the overall scope of control coverage, especially critical control requirements such as those brought on by legal or regulatory provisions. (Editor’s note: see Ben Rothke and David Mundhenk’s guidance on reducing PCI scope.)
# Classify affected data according to impact. Some controls will be identified as more critical, and the data elements associated with them will likewise be viewed as more sensitive. These information assets should be classified and labeled to ensure they receive adequate attention.
# Define data lifecycle elements based on the classification levels and requirements identified by the various standards and practices. Once the combined framework controls are in place, the data has been identified, scoped, and minimized, and classification levels have been established, a comprehensive data lifecycle program should be implemented. Through this process, end users can manage data elements in compliance with the chosen control framework requirements without having to research sometimes arcane control sets.
# Review existing infrastructure, policy, and procedure against the consolidated framework and data lifecycle requirements. Governance and operational resources must be reviewed against the newly developed framework and associated lifecycle elements. Where needed, changes should be made to support the new controls system.
# Implement consistent solutions across all data elements located within the tier. The supporting processes that enable the controls' effectiveness should be viewed from the perspective of consistent, modular growth. Networks, systems, and management tools should be designed to scale or be replaced easily. Consolidated security programs (such as incident response, vulnerability management, and change management) and scheduled requirements (audits, penetration testing, vulnerability assessments, risk assessments, and reports) should be updated to address all required controls across the entire framework, resulting in a consistent, singular approach to compliance and readiness.
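The first few steps above — mapping equivalent controls across frameworks into families, flagging the critical ones, and assigning a unified numbering scheme — can be sketched in code. This is a minimal illustration; the framework names, control IDs, and family labels below are invented for the example and are not drawn from any real control catalog.

```python
from collections import defaultdict

# Hypothetical controls from three source frameworks, each tagged with
# the functional family it belongs to and whether it is required for
# compliance (critical) or merely good to have.
source_controls = [
    {"framework": "PCI-DSS", "id": "8.2",        "family": "Access Control", "critical": True},
    {"framework": "HIPAA",   "id": "164.312(d)", "family": "Access Control", "critical": True},
    {"framework": "ISO-ish", "id": "A.9.1",      "family": "Access Control", "critical": False},
    {"framework": "PCI-DSS", "id": "10.1",       "family": "Logging",        "critical": True},
    {"framework": "ISO-ish", "id": "A.12.4",     "family": "Logging",        "critical": False},
]

def consolidate(controls):
    """Group controls into families and assign a unified index (e.g. AC-01)."""
    families = defaultdict(list)
    for ctrl in controls:
        families[ctrl["family"]].append(ctrl)

    unified = []
    for family, members in sorted(families.items()):
        # Build an index prefix from the family name's initials.
        prefix = "".join(word[0] for word in family.split()).upper()
        # List critical source controls first within each family, so the
        # lowest index numbers mark the must-do items.
        for n, ctrl in enumerate(sorted(members, key=lambda c: not c["critical"]), 1):
            unified.append({
                "index": f"{prefix}-{n:02d}",
                "family": family,
                "source": f'{ctrl["framework"]} {ctrl["id"]}',
                "critical": ctrl["critical"],
            })
    return unified

for row in consolidate(source_controls):
    print(row["index"], row["family"], row["source"],
          "CRITICAL" if row["critical"] else "optional")
```

In a real exercise the mapping between frameworks is the hard, judgment-heavy part; the payoff of an index like this is that every downstream artifact (audits, reports, remediation tickets) can reference one control identifier instead of three.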
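The data-scoping steps — identify affected data, purge regulated data with no business need, and classify what remains — can likewise be sketched. The store names, data elements, and classification labels here are illustrative assumptions, not a prescribed taxonomy.

```python
# Hypothetical data stores, each listing the data elements it holds and
# whether there is a genuine business need for the store to hold them.
data_stores = [
    {"name": "billing-db",    "holds": {"card_number", "name"}, "needed": True},
    {"name": "marketing-ftp", "holds": {"card_number"},         "needed": False},
    {"name": "crm",           "holds": {"name", "email"},       "needed": True},
]

# Data elements subject to critical (e.g. legal or regulatory) controls.
REGULATED = {"card_number"}

def scope_review(stores):
    """Split stores into those to keep (classified) and those to purge."""
    keep, purge = [], []
    for store in stores:
        in_scope = bool(store["holds"] & REGULATED)
        if in_scope and not store["needed"]:
            # Regulated data with no business need: purge it and shrink
            # the scope of critical control coverage.
            purge.append(store["name"])
        else:
            store["classification"] = "restricted" if in_scope else "internal"
            keep.append(store)
    return keep, purge

keep, purge = scope_review(data_stores)
print("purge:", purge)
print("keep:", [(s["name"], s["classification"]) for s in keep])
```

The point of the exercise mirrors the text: every store that can be purged is one less system the combined framework's critical controls must cover.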
The Office of Management and Budget (OMB) has finished its review of proposed rules related to changes to HIPAA privacy and security rules, meaning the rules could hit the streets this week.
The OMB reports that it has concluded its regulatory review of the rules HHS sent in April.
… When we asked the 379 respondents to our InformationWeek Analytics survey on regulatory compliance how many requirement sets their organizations are addressing, the No. 1 answer was four or more, at 35%.
New legislation continues to pass at a fast clip in the US under the new administration. Some of the most revealing actions taken so far include:
- May 20, 2009 – President Obama signed the Fraud Enforcement and Recovery Act of 2009.
- June 12, 2009 – United States Congressman Gary Peters introduced his Shareholder Empowerment Act to the House.
- June 17, 2009 – President Obama outlined plans for more sweeping reform of financial regulations that would aim to consolidate supervision over all firms that pose a risk to the financial system as a whole.
SAS 70 audits and PCI DSS assessments are fast becoming two of the most widely recognized and “must have” compliance initiatives for many businesses in today’s growing regulatory environment. Sarbanes-Oxley, HIPAA, and other federally mandated legislative acts have pushed Statement on Auditing Standards No. 70 (SAS 70) into the forefront of compliance. Similarly, Payment Card Industry Data Security Standard (PCI DSS) assessments have become a widespread compliance mantra affecting thousands of businesses across the globe. And as with any compliance mandate, particularly SAS 70 and PCI DSS, an enormous amount of time and effort is required for overall success.
One of the key requirements for compliance with PCI DSS (the Payment Card Industry Data Security Standard) is that organisations block all non-approved channels of communication, screen all traffic and prohibit direct routes for inbound and outbound internet traffic. The trouble is that many organisations forget about the communication traffic they cannot see: traffic that uses highly evasive techniques and easily circumvents the traditional security methods used to control the network.
Today’s workforce expects instant messaging and other real-time communications tools including web conferencing, Voice over IP, and social networking to be ‘always on’, just as their predecessors viewed email.
The problem is that Web 2.0 applications like IM, Skype and the chat function within Facebook can easily traverse the network without being seen, potentially allowing credit card information to leave the organisation unauthorised. If this traffic cannot be seen, it cannot be managed or secured, creating a significant risk of violating PCI compliance.
In a recent study of data collected from sixty FaceTime customers, there were over 51,000 individual requests for Facebook, and 30% of these were for Facebook chat. With 95% of all access requests for social networking sites allowed by policy, it is a sobering thought for those responsible for compliance.
Real-time communications is big business, and companies such as Yahoo!, AOL and Skype develop their applications to get as many users as possible signed up to their networks, rigorously testing client applications against standard enterprise security infrastructures to ensure they can tunnel through. Many applications use encrypted protocols, making it impossible for an intrusion prevention system to detect or control them.
In addition, they use peer-to-peer connections. Skype, for instance, uses a peer-to-peer connection and is encrypted end-to-end, often even tunnelling through HTTP if that is the only port that it finds open on the firewall, negating the use of a URL filtering solution to control it. Consequently, many organisations do not even realise that their users have installed real-time communications applications.
Should companies look to ban such technologies? The general consensus is no, though the jury is out on Skype (but that’s another story). Industry analysts such as Gartner say that companies should look to embrace such tools along with enterprise versions such as Microsoft OCS and Lotus Sametime — not just for their telephony savings, but for their recognised benefit of increasing productivity and collaboration in the workplace.
However, even companies implementing Unified Communications (UC) should be aware that though enterprise-grade solutions provide some management and control, they don’t natively provide everything required to comply with many regulatory standards such as the Data Protection Act, let alone with PCI DSS.
In addition, a lack of interoperability standards may still see employees installing other client software so that they can communicate with friends not using that UC tool, often exacerbating the problem.
Fully blocking rogue communication applications requires more than a traditional firewall. The first step to take is to understand the status quo, getting a thorough understanding of what employees are currently doing on the internet. There are free tools available that provide a deep look at exactly what is traversing the enterprise network, and the results are almost always surprising. Organisations that believe they have these applications locked down tend to be amazed when they discover the actual instances of unauthorised traffic on their network. Blocking ports on the firewall and disallowing access to specific URLs doesn’t cut it anymore.
Once companies have visibility of all traffic on their networks, it is then possible to apply policies to allow or block users and for those applications such as IM that are allowed, to enforce hygiene, content filtering and compliance logging. Only then will businesses be certain that they have covered some of the basics of PCI DSS compliance.
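The approach described above — visibility first, then per-application policy with logging for the channels that are allowed — can be illustrated with a toy policy engine. Classifying traffic by application is the hard part and is done by dedicated gateways in practice; the application names and policy choices below are examples for the sketch, not recommendations.

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("compliance")

# Per-application policy, applied after traffic has been classified.
POLICY = {
    "corporate-im":  "allow-and-log",  # enterprise IM: permitted, archived
    "skype":         "block",          # encrypted p2p: blocked by policy
    "facebook-chat": "block",
    "http":          "allow",
}

def enforce(app, user):
    """Return True if the session is permitted; archive allowed IM traffic."""
    action = POLICY.get(app, "block")  # default-deny unknown applications
    if action == "allow-and-log":
        # Compliance logging for permitted real-time channels.
        log.info("archived %s session for %s", app, user)
        return True
    return action == "allow"
```

The default-deny lookup reflects the article's point: anything that cannot be identified cannot be trusted, so unrecognised applications fall through to "block" rather than "allow".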
December 9, 2008, 01:01 PM — CSO —
In a country that’s seen many regulatory compliance challenges this decade, the headaches of PCI security tend to be analyzed from a largely American perspective.
In the process, companies tend to forget that PCI compliance has been a recipe for international indigestion.
“Remember that credit cards are used abroad, and many American companies have personnel handling credit card transactions in offices all over the world,” says Bruce Larson, security director at American Water, a major water utility that employs more than 10,000 people. “If you have a multinational organization, your data is not just sitting in the U.S.”
There may be some irony in hearing that from someone whose concerns are mostly based on security threats inside the U.S. Larson has to worry about everything from cyberattacks targeting computerized water filtration systems to terrorists who might try to bomb pipelines or poison the water supply. He also loses sleep whenever there’s the chance of a natural disaster.
The inconvenience of online, global commerce
But more people are using credit cards to pay the water bill online, and he knows the credit card data is floating around in databases outside the U.S. Losing any of that data could be a body blow in terms of public confidence. Then there’s the fact that American Water does business with vendors across the globe.
“I have a very geographically distributed network — more than 1,500 locations where humans work, 150-200 of those are critical operations facilities,” Larson told attendees during a PCI security seminar CSOonline held in New York in September.
For Harshul Joshi, director of IT risk and advisory services at CBIZ and Mayer Hoffman McCann P.C. (MHM), a professional business services company, doing business internationally can make for a lot of confusion regarding the PCI security ground rules.
“When we deal with non-U.S. companies, there is often confusion over what PCI security requires,” Joshi says. “We work with one of the largest magazine publishers with operations around the globe and if you dial an 800 number, chances are you’ll be talking to someone in a call center in Vietnam. You give your credit card number and it is recorded somewhere outside the U.S.”
On the outside looking in
If a company is based outside the U.S. — in Sweden or Ukraine, for example — the problem is usually a lack of communication and money regarding PCI security needs.
Dmitriy Tsygankov, director of the corporate customer care center at a bank based in Europe, says Visa USA tends to offer American companies more incentives and assistance for their compliance efforts. As an example, he mentions the US$20 million in financial incentives Visa USA offered nearly two years ago to encourage quicker adoption of the standard.
“Why does Visa USA offer merchants a $20 million bonus to become compliant and not other regions?” he asked. He suspects it’s because e-commerce is more popular and profitable in the U.S. In the bigger picture, he says, it can be harder for foreign companies to come up with the cash needed to achieve compliance.
No financial incentives were mentioned in a recent statement from Visa Inc. announcing new global PCI compliance deadlines. Under the deadlines, announced last week, global merchants and service providers must show by Sept. 30, 2009 that they are not storing full magnetic stripe data (track data), security codes or PIN data after a transaction is approved. Sept. 30, 2010, is the deadline for all service providers and Level 1 merchants to file compliance reports.
David Taylor, founder of the PCI Knowledge Base, agrees companies outside the U.S. don’t enjoy the same degree of financial support. “There really are no global incentives, just a marketing pitch in the Visa Global PCI Deadlines announcement last week to service providers,” he says.
Visa spokesperson Rosetta Jones confirmed Monday that the company does not currently offer any financial incentives for merchants outside the U.S.
“While Visa USA did offer some monetary incentives for U.S. merchants for a short period of time, the major motivator for merchants to achieve compliance has been their desire to properly protect cardholder data and to prevent being the target of a data compromise,” she says.
Keep the global perspective
Regardless, security experts agree companies must look at PCI security as a global mandate and ensure that the same controls used in the U.S. are being used elsewhere. There’s a danger of that not happening when companies find themselves deep in the weeds trying to get their arms around the sheer scope of the standard, says Daniel Blander, a CISM, CISSP and president of Techtonica Inc. in Los Angeles.
His advice is not to let the scope of the challenge get the better of the organization, and to use every remediation and control to give something back to the business that provides a non-PCI return on investment.
“File integrity monitoring is great for improving the quality of implementations and maintaining configuration standards if used correctly; configuration standards can improve the delivery of services and systems by promoting consistency,” he says, noting that’s good for business as a whole — wherever in the world the company operates from.