As merchants work to reduce the scope of PCI compliance and the risk of holding credit card data in their environments, some companies are taking access to this data away from the very people who need it to do their jobs, including the managers charged with investigating fraudulent credit card transactions. Instead of helping reduce fraud, PCI controls are, for some companies, making fraud detection more difficult.
SAS 70 audits and PCI DSS assessments are fast becoming two of the most widely recognized “must have” compliance initiatives for businesses in today’s growing regulatory environment. Sarbanes-Oxley, HIPAA, and other federally mandated legislative acts have pushed Statement on Auditing Standards No. 70 (SAS 70) to the forefront of compliance. Similarly, Payment Card Industry Data Security Standard (PCI DSS) assessments have become a widespread compliance requirement affecting thousands of businesses across the globe. And as with any compliance mandate, particularly SAS 70 and PCI DSS, an enormous amount of time and effort is required for overall success.
Compliance was already on every manager’s mind before Heartland Payment Systems reported that a breach early this year cost it $12.6 million in expenses and accruals during the first quarter of 2009.
Of those costs, $6 million was in fines from MasterCard and almost $1 million from Visa for alleged failures in PCI compliance.
WAKEFIELD, Mass., Apr. 20, 2009 — The PCI Security Standards Council, a global, open industry standards body providing management of the Payment Card Industry Data Security Standard (PCI DSS), PCI PIN Entry Device (PED) Security Requirements and the Payment Application Data Security Standard (PA-DSS), today expanded its PIN Entry Device Security Requirements program to cover two new types of devices: unattended payment terminals (UPTs) and hardware security modules (HSMs).
Unattended payment terminals are an increasingly popular form of conducting payment transactions and are used in a variety of scenarios such as museum and concert ticketing, kiosks, automated fuel dispensers and car parking facilities. Hardware security modules are non-user facing devices used in PIN translation, payment card personalization, data protection and e-commerce.
Both UPT and HSM hardware devices can now undergo a rigorous testing and approval process by Council labs to ensure they comply with the industry standards for securing sensitive cardholder account data at all points in the transaction process. The evaluation process includes the logical and physical security of each product. The Council will also provide a list of approved devices on its website, provide documentation and training for labs evaluating these devices and be the single source of information for device vendors and their customers.
“The Council advocates a multi-layered approach to security, based on PCI Standards,” said Bob Russo, general manager, PCI Security Standards Council. “The evolution of our PED Security Requirements Program incorporates a comprehensive testing process for UPTs and HSMs so that all components of these devices will now be tested. We are addressing the industry need among vendors and merchants to protect cardholder data in all point-of-sale environments.”
For More Information: Further details on the PCI Security Standards Council’s PED program can be found here: https://www.pcisecuritystandards.org/pdfs/PCI_PED_General_FAQs.pdf
The new security requirements and evaluation vendor questionnaires can be found on the PCI SSC website here: https://www.pcisecuritystandards.org/security_standards/ped/index.shtml
Visa opens PCI Dubai Conference
Dubai, UAE, 14 April 2009: Visa International, the leading payment solutions provider, participated in PCI Dubai, the leading payment industry conference, and addressed stakeholders from across the GCC payment industry on issues surrounding data security and payment card fraud. Participants also shared best practices and emerging technologies, and discussed ongoing industry challenges during the daylong event.
Kamran Siddiqi, MENA General Manager, Visa, delivered the keynote speech at the conference, which promotes the widespread adoption of the Payment Card Industry Data Security Standard (PCI DSS). In his speech, Mr Siddiqi focused on data protection and fraud prevention through common data security policies and practices.
“Visa has been at the forefront of the issue of data security and payment card fraud, initiating its Account Information Security (AIS) programme in 2000. The programme, which originally concentrated on the importance of data storage, was the precursor and foundation for the PCI DSS,” said Kamran Siddiqi, General Manager, Visa Inc.
Alongside the conference sessions, educational seminars and interactive workshops provided exchanges between participants, allowing for a greater understanding of the issues around payment card data vulnerability and the benefits of the PCI DSS.
“Gartner, the information technology research and advisory firm, estimates that 20 to 30 percent of Global 1000 companies suffer exposure resulting from privacy mismanagement. The costs to recover from these security mistakes often range from $5 million to $20 million for each organisation. Whether data loss is occurring accidentally by businesses or illegally through theft by individuals, the loss of data and the resulting fraud are largely preventable,” concluded Siddiqi.
Payment card industry compliance is confusing for many ecommerce merchants. But it potentially affects every merchant that accepts credit card payments. Failure to understand the PCI compliance standards could result in higher merchant account fees and fines from the credit card issuers.
Merchants oftentimes have similar general questions on PCI compliance. We posed some of them to Tim Erlin, principal product manager for nCircle, a security consulting and compliance firm that offers PCI-related services, among other compliance services. Those questions, and his answers, are below.
What is PCI?
Erlin: “PCI generally refers to the Payment Card Industry Data Security Standard, or the PCI DSS. This standard was developed by the PCI Security Standards Council, which is a consortium of the major credit card brands (Visa, MasterCard, American Express, and Discover). It represents the combination of two previously separate programs: the Visa Cardholder Information Security Program (CISP) and MasterCard’s Site Data Protection (SDP) program. The goal of the PCI DSS is to specify a common standard for protecting cardholder data from compromise.”
How does PCI compliance affect my ecommerce business?
Erlin: “If you accept credit cards as a form of payment, you are required to be compliant with the PCI DSS. In most cases, smaller merchants can achieve compliance by using compliant shopping carts and payment gateway services. If, however, you choose to collect and store credit card data as part of your business, you’ll need to carefully consider the requirements of the PCI DSS.”
“Larger volume merchants (more than 20,000 credit card transactions annually) will need to complete some specific validation requirements to demonstrate compliance with the PCI DSS. The requirements range from filling out a self-assessment questionnaire to an onsite audit from a qualified auditor. You can find out more details about merchant levels here.”
Where can I learn more about PCI?
Erlin: “The PCI Security Standards Council is the authoritative source for information. You can find their website at http://www.pcisecuritystandards.org. You can also look to the card brands themselves for additional information.”
My annual sales are very small. Do I still have to comply with PCI?
Erlin: “Every merchant that accepts credit cards must comply with PCI, but smaller merchants often achieve compliance by using compliant services. If you don’t store, transmit or process any credit card data, then your systems are out of scope for PCI DSS compliance.”
How do I know if my ecommerce business is PCI compliant?
Erlin: “Do you store, transmit or process credit card data? If the answer is yes, then you are required to fill out a self-assessment questionnaire to demonstrate PCI compliance. You may be required to perform other work to demonstrate compliance depending on your merchant level.”
“If you do not store, transmit or process credit card data, but do accept credit cards through a payment gateway or merchant account provider, then you should validate whether your providers are PCI compliant.”
What happens if my business is not PCI compliant?
Erlin: “If your business is not PCI compliant there are various measures that the card brands can take, ranging from warnings and monetary fines to revoking your ability to process transactions entirely. More importantly, the PCI DSS allows you to assure your customers that you’re protecting their credit card data appropriately.”
If my business is PCI compliant, does it reduce my insurance liability?
Erlin: “Generally, no. If you’re not compliant and experience a breach, however, you can be open to legal action from the affected customers.”
Will PCI compliance reduce my business’s merchant account fees?
Erlin: “This isn’t generally the case. In fact, it can increase the cost. Merchant account providers have to demonstrate their own PCI compliance, and they can and have passed that cost on to their customers.”
Where can I find a list of shopping carts and hosts that are PCI compliant?
Erlin: “Unfortunately, there is no single list of compliant shopping carts, hosts or other providers. However, because PCI compliance is a basic requirement for accepting credit card payments, all of the most common hosted shopping carts are PCI compliant. Choose the shopping cart that has the features and functions you need, then validate that their service is PCI compliant.”
PCI Council gives helping hand to merchants
Prioritized Approach framework to help attain PCI DSS compliance
Ian Williams, vnunet.com 04 Mar 2009
The Payment Card Industry Security Standards Council (PCI SSC) has released a new resource designed to help merchants struggling to attain compliance with the PCI Data Security Standard.
The global payment industry body launched the Prioritized Approach framework to help merchants that are not yet fully compliant. It is designed to identify the highest-risk targets, create a common language around PCI DSS implementation efforts, and demonstrate progress on compliance to key stakeholders.
The framework is made up of six ‘security milestones’ aimed at laying out a series of best practices for protecting against the highest risk factors and escalating threats facing cardholder data security. The milestones are as follows:
1. If you don’t need it, don’t store it
2. Secure the perimeter
3. Secure applications
4. Monitor and control access to your systems
5. Protect stored cardholder data
6. Finalise remaining compliance efforts, and ensure all controls are in place
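Milestone one (“if you don’t need it, don’t store it”) often reduces, in practice, to replacing any stored PAN with a truncated value plus an irreversible token. The sketch below is purely illustrative and not part of the standard; the function name is invented, and the hard-coded key is a stand-in for a key that a real system would keep in an HSM or vault:

```python
import hashlib
import hmac

# Hypothetical secret: in a real deployment this lives in an HSM or
# key vault, never in source code.
SECRET_KEY = b"demo-key-not-for-production"

def tokenize_pan(pan: str) -> dict:
    """Keep only what a fraud-review workflow typically needs:
    the last four digits for display and a keyed hash for matching
    repeat card use -- never the full PAN."""
    digits = pan.replace(" ", "").replace("-", "")
    token = hmac.new(SECRET_KEY, digits.encode(), hashlib.sha256).hexdigest()
    return {
        "last_four": digits[-4:],
        "card_token": token,  # stable per card, irreversible
        # deliberately NOT stored: full PAN, CVV, track data
    }

record = tokenize_pan("4111 1111 1111 1111")
print(record["last_four"])  # -> 1111
```

Because the token is deterministic per card, fraud analysts can still correlate repeat use of the same card without anyone holding the full number.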
“Securing cardholder data is the ultimate priority, and following the PCI DSS is the best way to achieve this,” said Bob Russo, general manager of the PCI Security Standards Council.
“The Prioritized Approach framework will help stakeholders understand where they can act to reduce risk earlier in their journey towards PCI DSS compliance.
“The launch of these new guidance and interactive documents is another step by the Council to increase understanding of and education around PCI DSS among merchants, providing them with insight into how they can protect cardholder data faster and demonstrate progress and compliance with the PCI DSS.”
According to the PCI SSC, the framework was based on actual data compromises, as well as feedback from assessors and forensic investigators, and input from the PCI SSC Board of Advisors.
The Prioritized Approach framework is available on the Council’s web site. It includes a reference document and downloadable worksheet that allows merchants to sort specific PCI DSS requirements by the individual milestones.
It’s a gross oversimplification of an utterly staggering technical and social challenge, and he knows it as well as anyone, but it’s hard to argue with PCI Security Standards Council General Manager Bob Russo’s assertion that when it comes to improving electronic data security and related matters of individual privacy, “something is much better than nothing.”
Since the massive, potentially record-breaking security breach at Heartland Payment Systems in late January, the Payment Card Industry Security Standards Council and its Data Security Standard (DSS) have been put under a microscope and criticized for foisting on companies an impractical IT security mandate that detractors say does not actually meet its goal of making it harder for companies that handle credit and debit card data to be fleeced similarly to Heartland.
Some highly respected security researchers and practitioners have come out since the Heartland robbery and questioned the viability of the entire DSS effort, which they perceive as out of touch with real-world IT environments and insufficient to help organizations avoid exploitation. A handful have gone so far as to say it actually makes securing those environments harder.
And after all, here’s a Tier 1 company that has likely had to push to abide by the technological and process-oriented stipulations of the PCI standard as much and as long as any other, and it just got positively hammered.
However, visiting Boston on a media tour organized to share some new elements of the PCI Council’s larger plans the week of Feb. 23, Russo and new PCI Security Standards Council Chairman Lib de Veyra — an executive at and appointee of JCB International Credit Card — made a lot of credible points. Mostly, because they firmly recognized the reality that no standard is perfect and that DSS as it exists is only a first step in a long evolutionary process.
To be clear, the PCI Council is satisfied with what it has put in place thus far, given the challenge at hand, Russo and de Veyra said.
The parts of DSS that need to be tweaked to address the vast diversity of infrastructure and applications employed by all the retailers, merchants and processors, as well as all the techniques utilized by attackers, will be addressed by taking feedback directly from the very companies that must comply with the standard, the PCI Council representatives said. And truthfully that has been at the very least a consistent message of the organization all along.
A number of powerful banking, retail, technology and government players are also involved in the PCI Advisory Board.
And the Heartland incident, as well as those reported at other companies that have been at some time certified as PCI compliant, including TJX Companies and Hannaford Brothers, in no way proves that the standard is clearly lacking in some specific area, they said.
The PCI leaders said that, beyond the fact that these companies have not yet shared with the Council specific details of exactly how they were victimized by fraudsters, a past judgment of conformity with DSS in no way guarantees that they were still compliant at the time they were attacked.
“Just because a company gets a clean bill of health today doesn’t mean they can’t be infected tomorrow,” de Veyra said. “Organizations are making configuration changes and broadening adoption of technologies like wireless all the time; the guidelines in DSS are something that you have to continue to monitor and maintain all the time.”
And many of the Council’s initiatives, including plans to launch two new standards aimed at improving the security of hardware security modules (HSMs) built into card data transaction processing hardware, and at regulating unattended payment terminals (UPTs) such as gas pumps and ticketing kiosks, will help push the entire industrywide process forward, they said.
The PCI Security Standards Council will also continue to push DSS overseas, specifically in Europe and APAC, where the guideline has faced some resistance from card handlers. But the effort launched by the world’s largest card companies (American Express, Discover, JCB, MasterCard and Visa) remains undaunted, PCI’s chief spokespeople said.
“Addressing the criticism comes down to communication; once we have enough information from companies like Heartland to truly examine what happened, we can understand how it relates to DSS,” de Veyra said. “And working with all the companies on our Advisory Board, meeting with them and incorporating their feedback over time, will be the most important aspect of maturing the standards.”
Another new element of DSS will be a technological tool, a sort of stripped-down PCI diagnostic application provided by the Council to offer organizations still getting started with the standard a more “prioritized approach to DSS.”
The Prioritized Approach tool will help companies track their progress against basic milestones on the way to compliance with DSS, the representatives said. The first three steps (preventing the improper storage of electronic data, securing the network perimeter and securing applications) have proven hard to accomplish for many organizations, and some might argue for most or even all.
But most importantly, the idea is to promote gradual coalescence of a world where every company affected by the PCI mandate has at least greatly augmented and formalized its approach to, if not its execution of, securing electronic data, the leaders said.
“No standard is ever going to completely stop what we’re seeing right now with cyber-crime, but the reaction we’ve seen to PCI after some of these incidents like Heartland has been absolutely unfair, because we don’t even know if they were compliant,” Russo said.
In terms of whether incidents like the breaches at Heartland, TJX and Hannaford Brothers have damaged public perceptions of DSS, the industry veteran said, as in any case, there is no shortage of opinions.
“You can sit there and look at it from one side and say, you have this standard but these incidents have still happened, and that proves something isn’t working,” Russo said. “But what you don’t know at the same time is: if we didn’t have DSS as it stands in place, how many more of these incidents might we have had?”
I’m sure that there are valid criticisms of various aspects of PCI — some very smart people have spent time voicing their questions already.
But, I’m curious to know whether they’d agree at the end of the day that something is better than nothing.
Matt Hines has been following the IT industry for over a decade as a reporter and blogger, and has been specifically focused on the security space since 2003, including a previous stint writing for eWEEK and contributing to the Security Watch blog. Hines is currently employed as marketing communications manager at Core Security Technologies, a Boston-based maker of security testing software. The views expressed herein do not necessarily represent the views of Core Security, and neither the company, nor its products and services will be actively discussed in the blog. Please send news, research or tips to SecurityWatchBlog@gmail.com.
The Death Of PCI DSS? Don’t Be Silly
Posted by George Hulme, Jan 27, 2009 10:13 PM
Yes, in the past year two big companies that handle card data, both apparently compliant with the Payment Card Industry Data Security Standard, were breached. Does that mean PCI DSS has grown increasingly irrelevant? That’s absurd.
First, I’m going to state the obvious to anyone who has studied IT security and compliance: being compliant with any mandate won’t make you secure. And it never will. I can’t make the case any better than Mike Fratto did in his earlier post, so I won’t even try. But the point is: focus on building a secure and sustainable infrastructure, no matter how big or small.
I’ve spent the better part of this decade interviewing IT security experts, vendors, chief information security officers, and other security managers in just about every industry. Generally — and I stress *generally* (as there are always exceptions) — retailers, manufacturers, and health care providers have tended to have the least mature security programs in place. Not true in every case, but I found it to be true often enough to see a trend.
Prior to the adoption of PCI DSS several years ago, online retailers and brick-and-mortar retailers barely paid attention to IT security. Trust me, in 2003, when it came to a Top 10 list of the most pressing IT objectives for most merchants, IT security ranked around 580th. They probably spent more time evaluating stationery than how to secure their databases, Web applications, and VPNs. So when you see retail IT managers arguing over whether they should install a Web Application Firewall, or conduct application security vulnerability assessments, or even do both, you can thank PCI DSS to a large degree.
PCI DSS has done more to raise security awareness among retailers than anything else I can think of. Even the torrent of breaches earlier this decade. And, while I can’t quantitatively prove it, PCI DSS has most certainly raised the security of the retail industry, in general.
The unfortunate breaches of Hannaford Bros. Co. and, more recently, Heartland Payment Systems, both of which were PCI compliant, don’t make PCI DSS irrelevant, obsolete, or worthless. Whoever thought PCI DSS would eliminate data breaches at retailers should probably make sure they don’t work in IT, and definitely make sure they don’t work in IT security. Its goal was to raise security among retailers and merchants, and to a large degree it’s been a success.
This standard isn’t perfect, not by a long shot. The standard won’t eliminate security breaches — and no one said it would. But the standard has increased the security of many retailers, and probably stopped quite a few breaches along the way.
Do we talk about the “failure” of law enforcement when someone commits a crime? Do we talk of the irrelevance of physical security when banks are robbed? Do we talk about how “worthless” the military is after a lost battle or two? Do we talk about how eating healthily and exercising are such a waste when someone falls ill?
No, intelligent people do not do those things.
The battle against cybercrime is like any other long-term fight, and once in a while companies that strive to do everything right are going to find themselves breached. It’s the nature of this beast, not the fault of PCI DSS.
The high cost of finding and patching application flaws is well known. Wouldn’t it be cheaper to write secure code in the first place?
One of the fastest growing areas in the software security industry is source code analysis tools, also known as static analysis tools. These tools review source code (or in Veracode’s case, binary code) line by line to detect security vulnerabilities and provide advice on how to remediate problems they find – ideally before the code goes into production. (See How to Evaluate and Use Web Application Scanners for tools and services that search for vulnerabilities as an outside attacker might.)
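As a rough illustration of the idea only (a real analyzer is enormously more sophisticated, and the single rule, sample source, and function names here are invented for the sketch), a line-by-line scan can be modeled as walking a program’s syntax tree and reporting findings with their line numbers:

```python
import ast

# Invented sample source containing a classic SQL-injection smell.
SOURCE = '''
def find_user(db, name):
    query = "SELECT * FROM users WHERE name = '%s'" % name  # tainted
    return db.execute(query)
'''

def scan(source: str) -> list:
    """Flag '%'-formatted string constants, a common SQL-injection
    smell, and report the offending line number -- the kind of exact
    location static analysis provides."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.BinOp) and isinstance(node.op, ast.Mod)
                and isinstance(node.left, ast.Constant)
                and isinstance(node.left.value, str)):
            findings.append((node.lineno, "string-formatted query"))
    return findings

for line, msg in scan(SOURCE):
    print(f"line {line}: {msg}")  # -> line 3: string-formatted query
```

That pinpoint line number is also what distinguishes this family of tools from dynamic scanners, which can usually only name the page or URL where a problem surfaced.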
The entire software security market was worth about $300 million in 2007, according to Gary McGraw, CTO at Cigital, Inc., a software security and quality consulting firm in Dulles, VA. McGraw estimates that the tools portion of that market doubled from 2006 to 2007 to about $180 million. About half of that is attributable to static analysis tools, which amounted to about $91.9 million, he says.
And no wonder; according to Gartner, close to 90% of software attacks are aimed at the application layer. If security were integrated earlier in the software development lifecycle, flaws would be uncovered earlier, reducing costs and increasing efficiency compared with removing defects later through patches, or never finding them at all, says Diana Kelley, founder of SecurityCurve, a security consultancy in Amherst, N.H. “Although there is no replacement for security-aware design and a methodical approach to creating more secure applications, code-scanning tools are a very useful addition to the process,” she says.
Despite the high degree of awareness, many companies are behind the curve in their use of static analysis tools, Kelley says, possibly due to the big process changes that these tools entail.
Key decisions in source code analysis
1) Should you start with static tools or dynamic tools or use both?
In addition to static analysis, which reviews code before it goes live, there are also dynamic analysis tools, which conduct automated scans of production Web applications to unearth vulnerabilities. In other words, dynamic tools test from the outside in, while static tools test from the inside out, says Neil McDonald, an analyst at Gartner.
Many organizations start with dynamic testing, just to get a quick assessment of where their applications stand, McDonald says. In some cases, the groups that start this initiative are in security or audit compliance departments and don’t have access to source code. The natural second step is to follow up with static analyzers, enabling developers to fix the problems found by dynamic analysis tools. Some companies continue using both, because each type yields different findings.
An important differentiator between the two types is that static analyzers give you the exact line of code causing the problem, while dynamic analyzers just identify the Web page or URL causing the issue. That’s why some vendors offer integration between the two types of tools.
Dynamic assessment tools “tend to be brute force,” says the chief scientist at a large software vendor. “You have to hit every parameter to find the vulnerabilities, whereas static tools investigate the whole landscape of the application.” He recently chose a code scanner from Ounce Labs, after outsourcing the work to Cigital since 2006. He became interested in application security when customers began requiring PCI DSS certification. He plans to add dynamic testing in the future, but the static analysis tool is the cornerstone of his application security program.
2) Do you have the source code?
Most static analyzers scan source code, but what happens if you want to analyze third-party software or code written so long ago that you only have the executable? In that case, Veracode, Inc. offers binary code scanning through a software as a service platform. “A vendor may not be willing to give you source code, but they will give you executables or binary,” Kelley says.
At the Federal Aviation Administration, Michael Brown, director of the Office of Information Systems Security, says he chose to use Veracode’s services this year because of the amount of vendor-written code the FAA anticipated using as a result of its modernization of the national airspace system. Brown says he wanted to ensure the code was not only functionally correct and of high quality but also secure. He wanted a service rather than a tool to reduce the need for training. So far, the results have been eye-opening, he says. “A lot of the code didn’t really take security into account,” he says. “There were cases of memory leaks, cross-site scripting and buffer overflows that could have been a cause for concern.”
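Of the flaw classes Brown mentions, cross-site scripting is the simplest to sketch in a few lines: the bug is echoing user input into HTML unescaped, and the fix is escaping on output. A minimal, hypothetical illustration (not FAA code; the function names are invented):

```python
import html

def render_greeting_unsafe(name: str) -> str:
    # Flaw: user input flows straight into markup.
    return "<p>Hello, %s!</p>" % name

def render_greeting(name: str) -> str:
    # Fix: escape on output so markup in the input is neutralized.
    return "<p>Hello, %s!</p>" % html.escape(name)

payload = "<script>alert(1)</script>"
print(render_greeting_unsafe(payload))  # script tag survives intact
print(render_greeting(payload))         # rendered as inert text
```

A scanner flags the first function because untrusted data reaches HTML output without passing through an escaping routine; the second breaks that data flow.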
3) What do you currently use for software quality?
Some tool vendors, such as Coverity, Inc., Klocwork, Inc., Parasoft Corp. and Compuware Corp., originated in the quality-testing arena and have added security capabilities, in contrast to vendors like Ounce Labs and Fortify Software, Inc., which were designed solely for security. It’s worthwhile to check into the quality tools you already use to see if you can leverage the existing relationship and tool familiarity. You should also consider whether it’s important to your organization to have the two functions merged into one tool in the long term, McDonald says. (See Penetration Testing: Dead in 2009? for more on the interplay of quality assurance and vulnerability detection.)
Source code analysis tools: Evaluation criteria
— Support for the programming languages you use. Some tools focus on languages for mobile devices, while others concentrate on enterprise languages like Java, .Net, C, C++ and even Cobol.
— Good bug-finding performance, using a proof of concept assessment. Hint: Use an older build of code you had issues with and see how well the product catches bugs you had to find manually. Look for both thoroughness and accuracy. Fewer false positives means less manual work.
— Internal knowledge bases that provide descriptions of vulnerabilities and remediation information. Test for easy access and cross-referencing to discovered findings.
— Tight integration with your development platforms. Long-term, you’ll likely want developers to incorporate security analysis into their daily routines.
— A robust finding-suppression mechanism to prevent false positives from recurring once you’ve verified them as non-issues.
— Ability to easily define additional rules so the tool can enforce internal coding policies.
— A centralized reporting component if you have a large team of developers and managers who want access to findings, trending and overview reporting.
Do’s and Don’ts of source code analysis
DON’T underestimate adoption time required. Most static analysis projects are initiated by security or compliance, not developers, who may not immediately embrace these tools. Before developers get involved, McDonald suggests doing the legwork on new processes; planning integration with other workflows like bug-tracking systems and development environments; and tuning the tool to your unique coding needs. “Don’t deploy to every developer at once,” he adds. “Ideally, you’ll get someone who wants to take on a competency role for security testing.”
The chief scientist at the large software vendor has developed an application security awareness program that includes training on common vulnerabilities, through podcasts and videocasts. Once he builds up awareness, he’ll educate developers on secure coding standards. To complete the circle, he’ll introduce Ounce’s static code analysis tool to enforce the standards and catch vulnerabilities “so it’s a feedback loop,” he says. (See Rob Cheyne Pushes for Developer Security Awareness for a look at a similar agenda.)
DO consider using more than one tool. Collin Park, senior engineer at NetApp, says the company uses two code analysis tools: Developers run Lint on their desktops, and the company uses Coverity each night to scan all completed code. “They catch different things,” he explains. NetApp began using these tools when its customer base shifted to enterprise customers who had more stringent requirements. While Coverity is better at spotting vulnerabilities such as memory leaks, Lint catches careless coding errors that developers make and seems to run faster on developer desktops, Park says.
According to Kelley, organizations typically implement static analyzers at two stages of the development process: within the development environment, so developers can check their own code as they’re writing, and within the code repository, so it can be analyzed at check-in time. The chief scientist uses this method. “In the first scan, if the engineer takes every finding and suppresses them, a milestone scan will catch those and generate a report,” he says.
DO analyze pricing. Vendors have different pricing strategies, McDonald says. For instance, while all continuously add information to their libraries about the latest vulnerabilities, some charge extra for this, while others include it in the maintenance fee, he says. In addition, some vendors charge per seat, which can get expensive for large shops and may even seem wasteful for companies that don’t intend to run the scanner every day, while others charge per enterprise license. Additionally, some vendors charge for additional languages, while others charge one price for any language they support, McDonald says.
DO plan to amend your processes. Tools are no replacement for strong processes that ensure application security from the beginning, starting with defining requirements, which should focus on security as much as functionality, according to Kelley. For instance, a tool won’t tell you whether a piece of data should be encrypted to meet PCI compliance. “If a company just goes out and buys one of these tools and continues to do everything else the same, they won’t get to the next level,” she says.
The chief scientist says it’s also important to determine what will happen when vulnerabilities are found, especially because the tools can generate thousands of findings. “Does the workflow allow them to effectively analyze, triage, prioritize or dispose of the findings?” he says. He is working with Ounce to integrate the system better with his current bug-tracking system, which is Quality Center. “It would be great to right-click on the finding to automatically inject it into the bug-tracking system,” he says.
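The right-click-to-bug-tracker integration he wants can be sketched as a simple mapping from a scanner finding to a ticket payload. The field names and severity table below are assumptions for illustration, not Quality Center’s actual API:

```python
# Hypothetical severity mapping; real rule names and levels will differ.
SEVERITY = {"SQL_INJECTION": "critical", "XSS": "high", "DEAD_CODE": "low"}

def finding_to_ticket(finding: dict) -> dict:
    """Turn one scanner finding into a bug-tracker ticket payload."""
    return {
        "summary": f"[{finding['rule']}] {finding['file']}:{finding['line']}",
        "severity": SEVERITY.get(finding["rule"], "medium"),
        "description": finding.get("detail", ""),
        "component": finding["file"].split("/")[0],
    }

ticket = finding_to_ticket(
    {"rule": "SQL_INJECTION", "file": "web/login.py", "line": 17,
     "detail": "tainted input reaches db.execute()"})
print(ticket["summary"], ticket["severity"])
```

Once findings are normalized into a payload like this, injecting them into any tracking system becomes a routine API call rather than manual copy-and-paste.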
At NetApp, Park has reworked existing processes to ensure developers fix flagged vulnerabilities. Before submitting code, developers run a test build, which must succeed or the code can’t be checked in. Then, when they check in code, an automated process starts an incremental build. If that build fails, a bug report is filed, complete with the names of developers who checked in code before the last build. “Developers are trained to treat a build failure as something they have to look at now,” Park says.
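That gating process might be automated along these lines; the build command and bug-filing callback here are hypothetical stand-ins, not NetApp’s actual tooling:

```python
import subprocess
import sys

def gate_checkin(build_cmd, checkins_since_last_build, file_bug):
    """Run the incremental build; on failure, file a bug naming everyone
    who checked in since the last good build, then block the check-in."""
    result = subprocess.run(build_cmd, capture_output=True, text=True)
    if result.returncode != 0:
        file_bug(
            title="Incremental build failure",
            suspects=[c["author"] for c in checkins_since_last_build],
            log=result.stderr,
        )
        return False
    return True

bugs = []
ok = gate_checkin(
    [sys.executable, "-c", "raise SystemExit(1)"],  # stand-in for a failing build
    [{"author": "alice"}, {"author": "bob"}],
    lambda **bug: bugs.append(bug),
)
print(ok, bugs[0]["suspects"])
```

Naming every developer who checked in since the last good build casts a wide net, but it guarantees the person who broke the build is on the notification list.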
NetApp also created a Web-based chart, automatically updated each night, that tracks which managers’ teams were issued Lint or Coverity warnings and whether those warnings have been cleared.
DO retain the human element. While the tools will provide long lists of vulnerabilities, it takes a skilled professional to interpret and prioritize the results. “Companies don’t have time to fix every problem, and they may not need to,” Kelley says. “You have to have someone who understands what is and is not acceptable, especially in terms of a time vs. perfection tradeoff.”
The chief scientist calls this “truly an art form” that requires a competent security engineer. “When the tool gives you 10,000 findings, you don’t want someone trying to fix all those,” he says. “In fact, those 10,000 may turn out to be just 500, or even 100, actual vulnerabilities.”
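One reason the numbers collapse like that is that many findings share a single root cause — for example, every call site of one unsafe helper. A minimal grouping sketch, with invented field names:

```python
from collections import defaultdict

def dedupe(findings):
    """Group raw findings by (rule, sink): many findings, few root causes."""
    by_cause = defaultdict(list)
    for f in findings:
        by_cause[(f["rule"], f["sink"])].append(f)
    return by_cause

# 1,000 call sites funneling into one unsafe query helper, plus one XSS:
raw = [{"rule": "SQL_INJECTION", "sink": "db.query", "file": f"mod{i}.py"}
       for i in range(1000)]
raw += [{"rule": "XSS", "sink": "render", "file": "views.py"}]

groups = dedupe(raw)
print(len(raw), "findings ->", len(groups), "distinct issues")
```

Fixing the one shared sink closes the entire first group at once, which is why triage by root cause beats working down the raw list.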
Park recalls an instance in which Coverity flagged what it called “a likely infinite loop.” At first glance, the developer could see there was no loop, but after a few more minutes of review, he found something else wrong with the code. “The fact that you get the tool to stop complaining is not an indication you’ve fixed anything,” Park says.
DON’T anticipate a short scan. NetApp runs scans each night, and because they cover thousands of files and millions of lines of code, a full code review takes roughly 10 hours. The rule of thumb, according to Coverity, is to allow two hours of analysis for each hour of build time. Coverity also supports incremental runs, scanning only the code that changed in the nightly build rather than the entire code base.
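Incremental scanning boils down to re-analyzing only what changed. A minimal sketch, using content hashes as a stand-in for whatever change detection the real tool uses:

```python
import hashlib

def changed_files(sources: dict, baseline: dict) -> list:
    """sources: {path: contents}; baseline: {path: content hash recorded
    at the last full scan}. Returns the paths that need re-analysis."""
    changed = []
    for path, text in sources.items():
        digest = hashlib.sha256(text.encode()).hexdigest()
        if baseline.get(path) != digest:
            changed.append(path)
    return sorted(changed)

# a.c is unchanged since the last scan; b.c is new.
baseline = {"a.c": hashlib.sha256(b"int main(){}").hexdigest()}
sources = {"a.c": "int main(){}", "b.c": "void f(){}"}
print(changed_files(sources, baseline))
```

Against a multimillion-line code base with a typical day’s churn, this kind of selection is what turns a 10-hour scan into something that fits a nightly window.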
DO consider reporting flexibility. At the FAA, Brown gets two reports: an executive summary that provides a high-level view of vulnerabilities detected and even provides a security “score,” and a more detailed report that pinpoints which line of code looks troublesome and the vulnerability that was detected. In the future, Brown would like to build into vendor contracts the requirement that they meet a certain security score for all code they develop for the FAA.
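The two report levels Brown describes, an executive score and per-line detail, can be sketched as follows. The weights and the scoring formula are invented for illustration; real tools compute their scores differently:

```python
# Hypothetical severity weights for the score calculation.
WEIGHTS = {"critical": 10, "high": 5, "medium": 2, "low": 1}

def score(findings):
    """Executive-level number: 100 minus weighted penalties, floored at 0
    (the formula is invented for this sketch)."""
    return max(0, 100 - sum(WEIGHTS[f["sev"]] for f in findings))

def detail(findings):
    """Developer-level report: one line per finding, pinpointing the code."""
    return [f"{f['file']}:{f['line']} [{f['sev']}] {f['rule']}" for f in findings]

findings = [{"file": "auth.c", "line": 88, "sev": "critical", "rule": "BUF_OVERFLOW"}]
print("score:", score(findings))
print(detail(findings))
```

A single number like this is also what makes a contractual threshold practical: a vendor contract can require, say, a minimum score on delivered code without enumerating every rule.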
DON’T forget the business case. When Brown first wanted to start reviewing code, he met with some pushback from managers who wanted a defined business need. “You’ve got program managers with schedules to meet, and they can view this as just another bump in the road that’s going to keep them from making their milestones,” he says.
Brown created the business case by looking to independent sources like Gartner and Burton Group for facts and figures about code vulnerability, and he also ran some reports on how much time the FAA was dedicating to patch management.
The chief scientist justified the cost of the Ounce tool by taking the total cost of the product and comparing that to the effort involved in a manual review. “With millions of lines of code, imagine how many engineers it would take to do that, and by the way, we want to do it every week,” he says. “The engineers would fall down dead of boredom.”
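His argument is easy to reproduce as back-of-the-envelope arithmetic. All of the numbers below — code size, review pace, engineer cost — are invented for illustration, not figures from the article:

```python
# Hypothetical inputs for the manual-review cost comparison.
LOC = 2_000_000          # lines of code to review
LOC_PER_HOUR = 150       # rough pace of a manual secure code review
COST_PER_HOUR = 100      # loaded engineer cost, USD
SCANS_PER_YEAR = 52      # "we want to do it every week"

hours_per_scan = LOC / LOC_PER_HOUR
annual_manual_cost = hours_per_scan * COST_PER_HOUR * SCANS_PER_YEAR

print(f"{hours_per_scan:,.0f} engineer-hours per manual pass")
print(f"${annual_manual_cost:,.0f} per year to repeat it weekly by hand")
```

Even with generous assumptions, the manual alternative lands in the tens of millions of dollars per year, which is the comparison that makes a tool license look cheap.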
Mary Brandel is a freelance writer based outside of Boston.