What Good is PCI-DSS?

May 2, 2012

With the most recent high-profile credit card data breach occurring late last month at Global Payments, one has to question the real benefit of PCI-DSS.  After all, didn’t a nationally-recognized Qualified Security Assessor (QSA) confirm their compliance with PCI-DSS?  If so, how is it that the company still had a breach?

There are very few details on exactly what happened at Global Payments.  One rumor has the breach occurring through a taxi company in New York.  Another rumor states the breach involved answering a series of knowledge-based security questions correctly.  The truth is, Global Payments may never know exactly what led to the breach.

Once the breach became public, VISA removed Global Payments from its list of “approved” card processors.  VISA indicated the company can be reinstated after an independent assessment of compliance with industry standards.  If we read between the lines,  VISA is essentially saying that since Global Payments had a breach, they must not have been in compliance with PCI-DSS standards at the time of the breach.  So where does that leave us regarding PCI compliance?  Basically the same place that any compliance review leaves you.  Just because an organization is compliant with a given standard does not mean that bad things won’t happen.

Credit card processors have some very valuable information that bad guys all over the world would love to get their hands on.  They are the Fort Knox of the modern world.  When the bad guys are motivated, it seems no amount of security can keep them out.  Does that mean the PCI-DSS standard is worthless?  Not at all.  It just means it isn’t foolproof, especially not in today’s world of spear phishing, trojans, and highly coordinated social engineering attacks.  When you have good locks on your data, the bad guys will simply begin targeting those within the organization who have the keys.

No matter how much technology you throw at security, people will always be the weakest link.  The PCI-DSS standard (and many others) doesn’t do a very good job of evaluating how well we train our people to recognize social engineering and spear phishing.  As evidence, look at the facts behind the breaches at RSA, Epsilon, and HBGary.  Each of those breaches involved a failure of humans to recognize that they were being enticed to hand over the keys.  If we ever do get any details about this latest breach at Global Payments, I’m betting there was a component of human failure.  It can be difficult to recognize the wolf in sheep’s clothing when it is asking for the keys.

PCI-DSS compliance is primarily about setting up and maintaining technology to protect credit card data.  With the exception of Requirement 12, the PCI-DSS criteria are predominantly about security technology such as firewalls, intrusion detection, encryption, IDs and passwords, and the like.  Requirement 12:  “Maintain a policy that addresses information security for all personnel.  A strong security policy sets the security tone for the whole entity and informs personnel what is expected of them. All personnel should be aware of the sensitivity of data and their responsibilities for protecting it.” That description does not address the need for a rigorous training program for the human factor.  Are all your employees equally capable of recognizing a spear phishing email?  Are they trained how to recognize a telephone-based social engineering exploit?  Are they absolutely clear on what information is secret, classified, and public?  Without regular ongoing training the human factor will continue to be the weakest link in our security and the bad guys will continue to exploit that weakness.
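
One way to make that training more than a checkbox is to measure it, for example by running periodic internal phishing simulations and tracking who clicks.  The short sketch below is purely illustrative; the CSV format (employee, department, clicked) and the file name are assumptions I made up for the example, not something prescribed by PCI-DSS or any particular awareness product.

```python
# Illustrative only: summarize results from a hypothetical internal
# phishing-simulation exercise. The CSV layout (employee, department,
# clicked) is an assumption made for this sketch.
import csv
from collections import defaultdict

def click_rates(path, threshold=0.10):
    """Return per-department click rates and flag departments above threshold."""
    totals = defaultdict(int)
    clicks = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            dept = row["department"]
            totals[dept] += 1
            if row["clicked"].strip().lower() in ("yes", "true", "1"):
                clicks[dept] += 1
    report = {}
    for dept, total in totals.items():
        rate = clicks[dept] / total
        report[dept] = (rate, rate > threshold)  # (click rate, needs refresher?)
    return report

if __name__ == "__main__":
    for dept, (rate, flagged) in sorted(click_rates("phishing_results.csv").items()):
        status = "schedule refresher training" if flagged else "ok"
        print(f"{dept}: {rate:.0%} clicked - {status}")
```

Even a crude report like that tells you which groups need another round of training, which is far more useful than a policy document sitting on a shelf.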

So what’s the answer?  First, we have to do a better job with education and training.  The SANS Institute has developed a two-day course, Management 433, devoted to “Securing the Human”.  The intent is to develop and strengthen the human side of the security equation through an effective security awareness program; one that will change employee behavior and give people more tools to recognize the wolf in sheep’s clothing.

Second, we need to work to improve our standards to include the human factor.  That will take time and effort on everyone’s part, but especially those at the PCI Security Standards Council and other standards organizations.  What have you done recently within your organization to strengthen the “human factor”?

SSAE 16 “First to Fail”?

December 27, 2011

I’m still waiting for a service organization to write a press release that:

  1. is accurate
  2. avoids the word “certification”
  3. shows a moderate level of understanding of SOC attestations
  4. announces that the service organization underwent the right SOC attestation for its business

This morning I was greeted with a press release from First To File®, announcing that they have “passed” their SSAE 16 audit “for the third year in a row”.  Hmmm.  Considering the SSAE 16 standard wasn’t released until 2010, that’s a pretty neat trick!  But that isn’t really why I’m writing about this press release, and I really am not trying to pick on First To File®.  Their press release just happens to contain many of the issues I have been trying to address with this blog.  Apologies in advance.

It appears to me, based on the description of First to File’s® business (patent prosecution support and document management services), that a SOC 1 audit was probably not the right type of SOC review for them to undertake in the first place.  One of the primary reasons the AICPA decided to do away with SAS 70 and create the SOC standards was that SAS 70 was being misused.

The AICPA white paper describing the new SOC standards says it best: “As organizations became increasingly concerned about risks beyond financial reporting, SAS 70 often was misused as a means to obtain assurance regarding compliance and operations.” 1  SOC 1 reports focus “solely on controls at a service organization that are likely to be relevant to an audit of a user entity’s financial statements.” 2

So if First to File® is in the business of document management, how do their services have any relevance to a user entity’s financial statements?  They are merely storing intellectual property (IP) in a web-based environment for their customers.  The only impact on the financial statements of their customers would be the fees paid for the services rendered.  You might even stretch things and conclude that the value of the IP is at risk since it is being stored and protected by a third party.  But that still does not justify the use of a SOC 1 (SSAE 16) report.

Certainly their customers would be interested in knowing what types of controls First to File® has in place over the security and confidentiality of that intellectual property.  This is precisely the scenario the AICPA created the SOC 2 report for.  It is intended for situations where a report is needed on controls at a service organization that mitigate risks related to security, availability, processing integrity, confidentiality, or privacy.  Of these, it appears to me at first glance that customers of a company providing document management services would certainly be interested in controls around security, confidentiality, and privacy.  Perhaps even availability, since it would be important to know that the web-based services will be available when needed.

So why would First to File® decide to ask their auditor for an SSAE 16 report?  Because the AICPA and many CPA firms have not sufficiently educated the marketplace regarding the intent and appropriateness of SOC 1 vs. SOC 2 vs. SOC 3, which is why I felt compelled to write this blog.

I can’t really blame the marketing and public relations folks who drafted the First to File® press release.  If CPAs and other controls experts can’t figure out the new standards, we shouldn’t expect marketing folks to get it.  If anyone is at fault, it is the CPA firm that undertook the engagement.  They should have done a better job of explaining the options and steered the customer away from SOC 1 and toward SOC 2.  If, after thoroughly understanding the options, the company still elected to have a SOC 1 (SSAE 16) report prepared, then all we can say is “the customer is always right”.

1  Service Organization Controls: Managing Risks by Obtaining a Service Auditor’s Report – AICPA, Nov 2010

2  Ibid.


SOC 2 is NOT SSAE 16

December 21, 2011

I just saw the following link related to a data center audit:

Cbeyond One of First SSAE 16 Certified Cloud Companies

Just when I thought things were getting better, along comes this press release that is wrong on so many levels I don’t even know where to begin… but I’ll try.

First off, SSAE 16 is NOT a certification, as I have pointed out MANY times (see Just as I Predicted…).  Secondly, SOC 2 is totally unrelated to SSAE 16.  Statement on Standards for Attestation Engagements (SSAE) No. 16 is specific guidance to CPA firms for planning and conducting Service Organization Control (SOC) 1 reviews, which are reviews of controls at service organizations that are likely to be relevant to user entities’ internal control over financial reporting.

So far, the AICPA has not released any specific SSAE for SOC 2.  There is an official “guide” to conducting a SOC 2 engagement, but there is not a specific Statement on Standards for Attestation Engagements (SSAE).

The following paragraph highlights the rampant confusion that exists in the marketplace regarding the new AICPA standards for Service Organization audits that replaced the old SAS 70 standard:

“Considered the second-generation data center audit standard, SSAE 16 SOC 2 reviews evaluate the design and operational effectiveness of a center’s controls against a strict series of international standards. Earning SSAE 16 certification demonstrates that Cbeyond Cloud Services is fully compliant with all necessary security and privacy specifications, and demonstrates that its customers are served and hosted in a highly secure, controlled facility.”

Neither SSAE 16 (SOC 1) nor SOC 2 is a “data center audit standard”.  And the SOC 2 criteria are NOT an “international standard”.

It is difficult to tell from this press release exactly what Cbeyond did, since it mixes SSAE 16 (SOC 1) and SOC 2 together.  Claiming “certification” is just more of the same ignorance that most of the industry shares.

If you are writing or reading press releases from data centers and cloud providers as a normal part of your day, please take the time to understand the new standards and what they mean.  Press releases like this one do nothing to clear the confusion created by the new SOC standards.  If you have questions about the standards, please speak to a qualified member of a CPA firm in order to ensure you are writing and reading with a full understanding.

Would You Open This File?

August 26, 2011

The screen shot below is purportedly an image of the actual email used to launch the zero-day attack against RSA, compromising their SecurID two-factor authentication technology and ultimately giving the bad guys a way into US defense contractors.

[Screenshot of the RSA phishing email]

The full story of how the file was eventually found within RSA’s email and malware repository can be viewed here.

Would you open the attachment?  Those of us in the security and controls world would probably have picked up on the poor grammar and the unexpected nature and title of the attachment, figured out that this was some kind of attempted compromise, and just deleted the email.  Others would likely have assumed the email was intended for someone else and opened it out of curiosity, thereby giving the bad guys a way in.

I’m putting it out here because it drives home the point that when combating these kinds of attacks, the only real hope you have is the vigilance and knowledge of your own employees in being able to recognize the bogus emails and attachments.  That can only be effective if you spend the time and the money required to train your employees and train them often.

It takes a certain level of cynicism and suspicion to recognize these kinds of attacks.  Many people just don’t have enough of those two qualities and as a result, will open every email and every attachment sent to them.  The bad guys keep doing it because it works.

The author of the full article raises an interesting question: “Why does Excel support embedded Flash?”  Are there really that many Excel users who thought embedded Flash was a “must have” feature?  Was there anyone from the security team in the meeting where they decided it was a good idea?  What could possibly go wrong?  It’s just Flash, right?

Hopefully the anti-malware vendors are already working on an option to identify Excel spreadsheets with embedded Flash and issue a stern warning or prevent infection.  In the meantime, we have to rely on good old-fashioned training to try to combat the compromise of our company systems.
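
Until the vendors catch up, a crude stopgap is to screen inbound Office attachments for the telltale SWF (Flash) signature before anyone can open them.  The sketch below is purely illustrative and is not a substitute for real anti-malware: it simply looks for the SWF magic bytes, so it can miss cleverly packed payloads and can false-positive on coincidental byte sequences.

```python
# Rough illustration only: flag Office attachments that appear to contain an
# embedded Flash (SWF) object by looking for the SWF magic bytes
# ('FWS', 'CWS', 'ZWS'). A real mail gateway or anti-malware product should do
# this job properly; this is just a heuristic sketch.
import sys
import zipfile

SWF_MAGICS = (b"FWS", b"CWS", b"ZWS")

def contains_swf(data: bytes) -> bool:
    return any(magic in data for magic in SWF_MAGICS)

def flag_attachment(path: str) -> bool:
    if zipfile.is_zipfile(path):               # .xlsx/.docx are zip containers
        with zipfile.ZipFile(path) as zf:
            return any(contains_swf(zf.read(name)) for name in zf.namelist())
    with open(path, "rb") as f:                # legacy .xls/.doc: scan the raw bytes
        return contains_swf(f.read())

if __name__ == "__main__":
    for attachment in sys.argv[1:]:
        if flag_attachment(attachment):
            print(f"WARNING: {attachment} may contain embedded Flash - quarantine it")
```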


Why Data Centers don’t need SAS 70 or SSAE 16

August 23, 2011

Most large Data Center and Colocation providers that have Fortune 500 customers have been providing SAS 70 reports for several years.  Now that SSAE 16 has been announced as the “replacement” for SAS 70, almost all of them are undergoing SSAE 16 reviews.  You can check my prior blogs SAS 70 is Dead and SSAE 16 is the New SAS 70 for more on the history of how we got here and the fact that “the customer is always right”.

I am continually amazed by how adamant many auditors and IT controls people are about why a data center or colocation provider needs an SSAE 16 audit.  I agree that DCs provide certain fundamental general controls that may impact the systems that are maintained there.  But even those general controls do not constitute Internal Control over Financial Reporting (ICFR), which is clearly a requirement for performing a SOC 1 (SSAE 16) review.  With few exceptions, DCs and colocation centers do not WANT to be able to alter the processing of their customers’ transactions and do everything in their power to avoid direct access to their customers’ systems.

So what exactly is ICFR?  The SEC is the overlord of Sarbanes-Oxley compliance and the purveyor of wisdom regarding ICFR.  It defines ICFR as:

“A process designed by, or under the supervision of, the registrant’s principal executive and principal financial officers, or persons performing similar functions, and effected by the registrant’s board of directors, management and other personnel, to provide reasonable assurance regarding the reliability of financial reporting and the preparation of financial statements for external purposes in accordance with generally accepted accounting principles and includes those policies and procedures that:

(1) Pertain to the maintenance of records that in reasonable detail accurately and fairly reflect the transactions and dispositions of the assets of the registrant;

(2) Provide reasonable assurance that transactions are recorded as necessary to permit preparation of financial statements in accordance with generally accepted accounting principles, and that receipts and expenditures of the registrant are being made only in accordance with authorizations of management and directors of the registrant; and

(3) Provide reasonable assurance regarding prevention or timely detection of unauthorized acquisition, use or disposition of the registrant’s assets that could have a material effect on the financial statements.”

Now that’s a very long definition, but it is important for understanding what ICFR is.  Several key phrases stand out in that definition, noting policies and procedures that:

  • “Pertain to maintenance of records” – does a data center maintain records? No. A DC maintains an environment of physical security, environmental controls, and connectivity.  The user organization is responsible for maintaining the records.
  • “Provide reasonable assurance that transactions are recorded as necessary” – Does a DC provide assurance?  No, the user organization does that.
  • “Provide reasonable assurance regarding prevention or timely detection of unauthorized acquisition, use or disposition of … assets” – Does a DC provide this assurance?  No, not unless they are providing managed services in addition to basic data center services.

So where is the link to ICFR?  When examining the types of controls that a typical DC or colo facility provides, there is no relevant link to ICFR.

“So Barton, are you saying that user organizations shouldn’t be concerned with controls at their third-party colo or DC?”  Absolutely not.  I am saying that a SOC 1 (SSAE 16) report is not the right answer for a colo or DC.  The more appropriate SOC report would be a SOC 2 report.

SAS 70 (and subsequently SSAE 16) was never meant to be a report on IT general controls.  It became popular for DCs and colos because auditors didn’t know of any alternative to SAS 70 for understanding controls at service organizations.  Now that the AICPA is promoting SOC 2 as an alternative for understanding IT general controls, there is no reason for a DC or colo to undergo an SSAE 16 review.

For most DCs and colos, the services provided are no different from those provided by the building management companies for large office buildings throughout the country.  Building management companies lease space.  That space includes physical security and environmental controls.  They don’t typically provide connectivity but there may be some that do.  DCs and colos lease space that includes physical security, environmental controls, and connectivity.  Those services do not constitute ICFR under the SEC definition.

So don’t ask your DC or colo provider to give you an SSAE 16 report.  Instead, look into an alternative like the SOC 2 or SOC 3 report to get an understanding of the IT General Controls they provide.

Got Bots?

While I was at lunch today I happened to see a short video on CNN about a company, Unveillance, that provides Security as a Service focused on identifying compromised computers (bots) in private and corporate networks.  For all the time and money organizations spend installing and updating anti-malware on their networks, the real bad guys are using and sharing new techniques to compromise devices without triggering anti-malware software.

A bot is geek-speak for a software version of a mechanical robot.  A botnet is a collection of compromised computers used for malicious purposes. 1  In most cases the compromised machines are used to collect information such as account numbers, user IDs, passwords, and the like.  This information is then automatically sent back to a Command & Control (C&C) server, which is controlled by a botnet administrator.  Botnets are also sometimes used to distribute spam or to execute Distributed Denial of Service (DDoS) attacks against websites.  Because the attacks come from compromised machines, it is difficult to trace them back to the person actually initiating and controlling the attack.

The CNN piece goes on to discuss the current threat level, which Unveillance estimates at 6% of all IP addresses globally.  With roughly four billion IPv4 addresses in existence, that translates to an astounding 240 million computers that may be compromised and controlled to some degree by the bad guys.  Since the techniques used to compromise machines are not detected by conventional anti-malware, the owners and operators of compromised machines typically don’t even know they have been compromised.  In most cases the computer acts normally, but is recording keystrokes and copying data undetected in the background.

This news, while disturbing, is consistent with some of the conversations I had last Friday at the Atlanta Technology Summit.  Several security experts I spoke with at the conference told horror stories of entire corporate networks being “pwned” (owned) by foreign governments or organized crime rings in the Far East and Eastern Europe.  Multiple attempts to “clean” these corporate networks have been made, but within a short time the evidence of compromise resurfaces.

After listening to my colleagues at the Atlanta Technology Summit and seeing the CNN piece, I realized that most of the IT risk assessments I have done in recent years did not address this issue.  So what am I going to change in my methodology for performing risk assessments?  First, I will ask management what policies and training are in place to help reduce the risk of compromise.  Many machines get compromised as a result of social engineering techniques that rely on a valid user clicking a link to an infected website or executing a payload program embedded in an email.  So how do you combat that kind of attack?  The key is effective and frequent education and training.  You should also constantly look for ways to improve the effectiveness of your spam filtering and website traffic monitoring.
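
On the traffic monitoring point, one question worth asking is whether anyone is looking for beaconing: internal hosts that call the same outside address at suspiciously regular intervals, the way a bot checks in with its C&C server.  The toy sketch below illustrates the idea against a made-up connection log of (timestamp, source, destination) records; real network monitoring tools do this far more robustly, and the thresholds here are arbitrary.

```python
# Toy illustration of beacon detection: flag (source, destination) pairs whose
# outbound connections occur at near-constant intervals, a pattern typical of
# bot check-ins with a C&C server. The log format is an assumption made for
# this sketch, not any particular product's output.
from collections import defaultdict
from statistics import mean, pstdev

# (timestamp_seconds, source_ip, destination) -- hypothetical sample records
LOG = [
    (0, "10.0.0.5", "203.0.113.9"),
    (300, "10.0.0.5", "203.0.113.9"),
    (600, "10.0.0.5", "203.0.113.9"),
    (905, "10.0.0.5", "203.0.113.9"),
    (120, "10.0.0.7", "198.51.100.20"),
    (4000, "10.0.0.7", "198.51.100.20"),
]

def find_beacons(log, min_connections=4, max_jitter=0.1):
    """Return (source, destination) pairs with suspiciously regular intervals."""
    by_pair = defaultdict(list)
    for ts, src, dst in log:
        by_pair[(src, dst)].append(ts)
    suspects = []
    for pair, times in by_pair.items():
        times.sort()
        if len(times) < min_connections:
            continue
        gaps = [b - a for a, b in zip(times, times[1:])]
        avg = mean(gaps)
        # Low jitter relative to the average gap suggests automated check-ins.
        if avg > 0 and pstdev(gaps) / avg <= max_jitter:
            suspects.append((pair, avg))
    return suspects

if __name__ == "__main__":
    for (src, dst), interval in find_beacons(LOG):
        print(f"{src} -> {dst}: checks in roughly every {interval:.0f}s")
```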

In some cases, awareness of the risk and vulnerability is the first step toward designing and implementing appropriate controls.  Unfortunately I don’t have the cure on this one, but it appears that Unveillance at least has a method for identifying the disease.

1 – Wikipedia – http://en.wikipedia.org/wiki/Botnets – July 2011

SSAE 16 is the new SAS 70? Not So Fast!

July 24, 2011

As you know, the AICPA officially retired Statement on Auditing Standards No. 70 (SAS 70) as of June 15, 2011.  The AICPA has released three new standards for service organization audits, known as Service Organization Control (SOC) reports: SOC 1, SOC 2, and SOC 3.  SOC 1 has published guidance associated with it, SSAE 16, which is the “official replacement” for SAS 70.

Due to the timing  and publicity around SSAE 16, many organizations (and CPA firms for that matter) do not fully understand the differences between the three new SOC reports and when each report is most appropriate.  Instead, they assume that if a SAS 70 report had been issued in the past, then an SSAE 16 report should be issued after June 15, 2011.  Depending on the type of services being provided and the intended use of the report, an SSAE 16 report may not be the most appropriate choice.

Part of the reason the AICPA created the new standards was to attempt to rectify the misuse and abuse of SAS 70 reports.  In years past, many colocation facilities and data centers were asked by their customers to provide a SAS 70 report.  In many cases, these data centers have nothing whatsoever to do with the processing of their customers’ transactions or the operation of the software they utilize.  They merely provide a physically, environmentally, and logically secure environment for the computing equipment owned and operated by their customers.  They have little if anything to do with logical access controls, processing controls, change controls, or reporting of processing results.  They are not unlike the landlords of many office buildings that supply facilities, heating and cooling, and limited building security to businesses every day.

SAS 70 reports were intended to assist financial statement auditors in understanding the controls present at third-party service organizations that were relevant to user entities’ financial statements.  Would it make sense for auditors to request a SAS 70 report from the landlord of the building where their client’s business resides?  No.  And it doesn’t make sense for auditors and customers of a colocation facility to ask the colo to provide an SSAE 16 report.  Why?  My primary argument is that the services provided by the colo have very little impact on the customer’s controls over financial reporting.  But before you all go flaming me and bringing up Sarbanes-Oxley and IT General Controls, read my next argument.  My secondary argument is that there is now a better alternative to SSAE 16 and its predecessor SAS 70 for reporting on the ITGCs at a colocation facility: the AICPA SOC 2 and SOC 3 reports.

There are many readers who will disagree with my statement that the services provided by a colo have little impact on controls over financial reporting, because Sarbanes-Oxley requires auditors to understand the IT General Controls (including environmental and physical controls) that support applications and systems relevant to financial statements.  I agree that it is important to understand these foundational controls.  However, the controls that matter most in systems with relevance to financial reporting are not under the control of the data centers and colo providers.  Logical access controls, segregation of duties, program change controls, and operations controls are normally the responsibility and domain of the customer, not the colo.  These controls are far more relevant to ongoing control over financial reporting.  A locked facility with adequate environmental controls is important, but it does not have the same impact on financial reporting that good logical access, program change, and operations controls will have.

Wouldn’t it make more sense to have a standard set of ITGC criteria so that every processing environment (e.g. data center, colocation facility, IT “closet”) could be graded on a standard scale?  Wouldn’t that make everyone’s life a little easier and provide better information for the auditors AND the customers?  Yes, it would.  And the good news is that the AICPA has given us a working tool that is far superior to the wildly fluctuating quality and coverage of SSAE 16/SAS 70 reports.

SOC 2 and SOC 3 reports are based on the AICPA and CICA Trust Services Criteria.  The Trust Services Criteria are divided into five Principles: Security, Availability, Processing Integrity, Confidentiality, and Privacy.  Each of the five Principles contains a list of criteria that support that Principle.  For most data center and colocation service providers, an audit based on the Security and Availability Principles would provide ample evidence of the state of ITGCs applicable to whatever systems the customer is operating.  Granted, once a service provider moves into the realm of Platform as a Service or Software as a Service, you will likely have to add the Processing Integrity criteria to the mix.  Confidentiality and Privacy could also be part of the scope of an audit, based on the types of data being maintained and the industry or compliance requirements of the customer.

I will be the first to admit that the criteria currently published by the AICPA and CICA are not a silver bullet.  My initial impression is that the Principles and criteria are in some cases difficult to interpret and poorly organized.  But they are still a superior alternative to allowing every DC and colo to write its own criteria, which is what happens with SAS 70 and SSAE 16 audits.

So before you request an SSAE 16 report from your client or service provider, take the time to understand SOC 2 and SOC 3.  I believe that for most data center and colocation providers, these new report offerings are a better alternative to SSAE 16 reports because there is a pre-defined set of controls criteria that we auditors will be using as a baseline for evaluation.  The best place to begin your education about SOC 2 and SOC 3 is the AICPA website: http://www.aicpa.org/SOC