Wednesday, May 18, 2011

PCI QSA Re-Certification – 2011 Edition

Wednesday, May 17, 2011

It is that time of year again: time for the PCI Guru to take the PCI SSC’s QSA re-certification training and test. As with last year, the process is all online.

The process started this year with our Key Contact person emailing me the invoice for the training. Since the PCI SSC is creating individual invoices for each QSA to be trained, our firm is requiring the invoice to be paid by the QSA and then expensed through the firm’s expense reporting system.

Why the PCI SSC cannot just issue a single invoice for a firm and get it over with, I just do not know. I had to fax the invoice into the PCI SSC with my credit card information. They make it very clear that they have a secure fax server.

I will say this: I faxed in the invoice on Monday and by Tuesday I had my logon credentials for the training and examination. So the registration process is very quick.

The PCI SSC appears to have contracted with a new CBT provider that has better capabilities than last year’s provider. The site is simple but functional and easy to navigate.

I did have some issues getting the training content to load and play properly. From time to time, I would get messages indicating that a “bad URL” had been supplied. This appeared to be a timeout issue, as clicking the content again would eventually get it displayed and played.

The training is broken into four modules. The first module covers the usual topics related to the PCI SSC, the various PCI standards, card processing and other general topics. The second module covers an overview of the PA-DSS, roles and responsibilities of the various PCI players, validation requirements and overview of the PCI SSC’s assessor quality management (AQM) program.

The third module is all about the PCI DSS v2.0. The fourth and final module covers miscellaneous topics such as virtualization, documentation required for the Report on Compliance, cardholder data discovery, scoping the cardholder data environment and compensating controls.

There are quizzes at the end of each module to test your retention of the material covered. Each quiz is around eight questions, and the questions seem to be representative of what is on the examination. According to the documentation on the Web site, this material takes around six and a half hours to cover.

The examination comprises 60 true/false and multiple choice questions. You are given four hours to complete the examination and, according to the documentation, you can pause the examination any number of times and come back at a later time to complete it.

You only get one chance to go through the examination, so being able to pause it is nice to have available should you get interrupted. I am not sure whether you can skip questions and come back to them later. It took me about 45 minutes to go through the test, and I had some interruptions.

I liked the new Web site but was frustrated at times that content was not always available. I am not positive whether the problem was at my end or the CBT provider’s. But since I was on a couple of different networks while I went through the content, I am guessing the problem was with the CBT provider, as I got the content availability errors on all of the networks I used.

As with last year, the training slide decks are not available for download. I just do not understand why the PCI SSC does not make the slides and notes available as one or more PDFs.

Not only would it be useful for offline review, but it would also be nice to have as a reference. I am guessing that they feel that people who have the training material available longer than others have a better chance at passing the examination.

Of the four modules, module three is probably the best of the lot because of its discussion of the PCI DSS. Each of the 12 requirements is organized around:

  • The general concept of the requirement;
  • Understanding the requirement; and
  • Assessor recommendations.

The general concept of the requirement is just a re-iteration of what is in the preamble of the requirement as written in the PCI DSS. The Understanding section goes into more detail on the various high points of the requirement (i.e., the X.1, X.2, X.3, etc. level).

Not only are these sub-requirements generally discussed, but there is also a discussion about why these sub-requirements are necessary. These first two items are very useful for training clients about why the PCI DSS process is necessary.

The real value though is with the assessor recommendations. For the first time, the PCI SSC goes on the record and states, in general terms, what types of observations, interviews and documentation need to be obtained and reviewed by the QSA to ensure the requirements are satisfied.

Based on some of the Reports on Compliance I have seen lately, I think a lot of QSAs are going to find out that what they are currently doing for fieldwork is not acceptable. This information would also go a long way toward helping clients appreciate why a Report on Compliance takes the time and money it does.

The examination is similar to last year’s re-certification examination – a variety of true/false and multiple choice questions. The questions appear to be written to focus the QSA on black and white issues and to avoid any nuances.

For example, I had a true/false question that stated, “An application that processes, stores or transmits cardholder data sold to a single merchant by a software company must be PA-DSS certified.” Now, I know what they are trying to get at with this question, and the answer is false. However, the real answer is not so simple and depends on the software vendor.

If we are talking about MICROS as the vendor, there is a high likelihood that the software will be resold to more than just one organization, so the software should go through the PA-DSS certification process.

Regardless of whether or not software is PA-DSS certified, the bottom line is that a QSA is going to be required to assess the application for compliance with the PCI DSS and will have more work effort if the software is not PA-DSS certified.

In the end, the good news, or bad news for some of you, is that I was re-certified to be a QSA for another year.

Tuesday, February 22, 2011

Federal Cloud Computing Strategy Officially Launched

Monday, February 21, 2011


Kevin L. Jackson


Federal CIO Vivek Kundra officially launched the Federal Cloud Computing Strategy. While this is clearly not news, the document does state the government's position in a very succinct manner:
  • By using the cloud computing model for IT services, we will be able to reduce our data center infrastructure expenditure by approximately 30% (which contributes to the estimated $20 billion of IT spending that could be migrated to cloud computing solutions).
  • Cloud computing can complement data center consolidation efforts by shifting workloads and applications to infrastructures owned and operated by third parties.
  • The shift to cloud computing can help to mitigate the fragmented data, application, and infrastructure silo issues associated with federated organizational and funding models by focusing on IT services as a utility.
  • Cloud computing can accelerate data center consolidation efforts by reducing the number of applications hosted within government-owned data centers.


Cloud computing allows the Federal Government to use its IT investments in a more innovative way and to more easily adopt innovations from the private sector. Cloud computing will also help our IT services take advantage of leading-edge technologies including devices such as tablet computers and smart phones.

The strategy document also highlights the necessary change in federal agency mindset. "To be successful, agencies must manage cloud services differently than traditional IT assets. As with provisioning, cloud computing will require a new way of thinking to reflect a service-based focus rather than an asset-based focus."

Security concerns are also addressed in a head-on, balanced manner:

"The Federal Government will create a transparent security environment between cloud providers and cloud consumers. The environment will move us to a level where the Federal Government’s understanding and ability to assess its security posture will be superior to what is provided within agencies today."

"The first step in this process was the 2010 Federal Risk and Authorization Management Program (FedRAMP). FedRAMP defined requirements for cloud computing security controls, including vulnerability scanning, and incident monitoring, logging and reporting. Implementing these controls will improve confidence and encourage trust in the cloud computing environment."

"To strengthen security from an operational perspective, DHS will prioritize a list of top security threats every 6 months or as needed, and work with a government-wide team of security experts to ensure that proper security controls and measures are implemented to mitigate these threats."

"NIST will issue technical security guidance, such as that focused on continuous monitoring for cloud computing solutions, consistent with the six step Risk Management Framework (Special Publication 800-37, Revision 1)."



Monday, October 25, 2010

Integrate ASV data directly into your IT-GRC System

Through our award-winning IT-GRC platform, SecureAware®, we recently completed an asset-based ASV proof-of-concept demonstration for a very large merchant. This organization has over 3,000 locations globally and manages over 90,000 network assets. The considerations for integrating asset scan data were two-fold. First, our objective was to prove the ability to automate the integration of the raw scan data by asset type, identified vulnerability, and recommended remediation plan. The remediation plan was also linked (by type / class) to the policy set for instantaneous access by the asset owner.

The second objective was to demonstrate the ability to integrate this information into the workflow by assigning each vulnerability to a specific asset owner along with a scheduled completion date, with the task tracked not only by the asset owner but also by the supervisor and any other designated observers / interested parties. All of this is done in an environment that captures timestamps and associated documentation for complete auditability.

Our next steps with this merchant are to collect the specifications for integrating this data into their current network asset compliance system to augment internal tracking, improve workflow, and increase visibility into their IT risk management posture – all in an effort to reduce their cost of compliance in the long run.
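
For readers who want a feel for what this kind of integration involves, below is a minimal sketch in Python of the general idea: raw ASV scan findings are grouped by asset, linked to a remediation plan, and turned into tracked tasks with an owner, a due date, observers, and a timestamped history. All class and function names (ScanFinding, RemediationTask, create_tasks) are hypothetical illustrations, not SecureAware's actual API or data model.

from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Dict, List

# All names below are hypothetical illustrations, not SecureAware's actual API.

@dataclass
class ScanFinding:
    asset_id: str          # network asset flagged by the ASV scan
    asset_type: str        # e.g. "web server" or "POS terminal"
    vulnerability: str     # identified vulnerability
    remediation_plan: str  # recommended fix, linked by type/class to the policy set

@dataclass
class RemediationTask:
    finding: ScanFinding
    owner: str                                           # asset owner responsible for the fix
    due_date: datetime                                    # scheduled completion date
    observers: List[str] = field(default_factory=list)   # supervisor and other interested parties
    history: List[str] = field(default_factory=list)     # timestamped entries for auditability

    def log(self, note: str) -> None:
        # Capture a timestamp with every event so the task is fully auditable.
        self.history.append(f"{datetime.utcnow().isoformat()} {note}")

def create_tasks(findings: List[ScanFinding],
                 asset_owners: Dict[str, str],
                 sla_days: int = 30) -> List[RemediationTask]:
    """Turn raw ASV scan findings into tracked remediation tasks in the workflow."""
    tasks = []
    for f in findings:
        owner = asset_owners.get(f.asset_id, "unassigned")
        task = RemediationTask(
            finding=f,
            owner=owner,
            due_date=datetime.utcnow() + timedelta(days=sla_days),
            observers=["supervisor"],
        )
        task.log(f"Assigned '{f.vulnerability}' on {f.asset_id} to {owner}")
        tasks.append(task)
    return tasks

In a real deployment the findings would be parsed from the ASV scan report and the asset-owner mapping pulled from the asset inventory; the point of the sketch is only to show how little glue is needed once the scan data carries asset, vulnerability, and remediation information.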


Gary B. Blume
Senior Vice President - Corporate and Business Development

Lightwave Security, Inc.
Atlanta, GA

Office: 404.939.8875
Mobile: 404.276.6192
Fax: 404.751.2830
E-mail: gblume@lightwavesecurity.com
Linkedin: http://www.linkedin.com/in/garyblume

Tuesday, October 19, 2010

How to implement ISO 27001 - Free Webinar!

Hello,

I wanted to let you know that we are organizing a free webinar called "How to implement ISO 27001?".

This free one-hour training is designed for organizations that plan to implement ISO 27001, and have no previous experience in such projects. This session will explain all the steps in ISO 27001 implementation, and provide tips on how to proceed with this complex task.


This webinar is in English, and covers the following topics:
  • Plan - Do - Check - Act cycle
  • ISMS scope
  • ISMS policy
  • Risk assessment and treatment
  • Risk assessment report
  • Statement of Applicability
  • Risk treatment plan
  • Annex A - overview of controls
  • Four mandatory procedures
  • Document management
  • Records management
  • Internal audit
  • Management review
  • Corrective and preventive actions

The webinar is delivered by Dejan Kosutic, the author at Information Security & Business Continuity Academy.

To register for this webinar, please visit: https://www3.gotomeeting.com/register/794135934
About the organizer: Information Security & Business Continuity Academy is the leading online resource for ISO 27001 and BS 25999-2 implementation. Visit http://www.iso27001standard.com/.


Best regards,

Dejan Kosutic

Monday, October 4, 2010

HIPAA Violations Not Always Due to Data Breaches

Contributed By:
Jack Anderson


On an early album George Carlin (RIP) talked about being raised Irish Catholic. Remarking on mortal sins he observed that if you woke up in the morning and decided to go across town and commit a mortal sin, you could save your bus fare because you already committed a mortal sin just by thinking about committing a mortal sin.

Similarly, you don't have to have a patient data breach to be in violation of HIPAA rules and regulations. By doing nothing, not even thinking, you probably have already committed a violation.

For example, if you have a business associate (BA) agreement in place, you are required to be compliant with the terms of that agreement, now. If you don't have a breach notification program in place, you are in violation, now.

If you don't have a privacy program in place you are in violation, now.

But, you say, I am a small company and how would they know? Let me count the ways:

1. Your covered entity detects a pattern of non-compliance, such as you sending unsecured PHI, and is required to either help you fix the problem or sever your contract and report you to HHS.
2. A whistleblower (employee, ex-employee, patient, ex-patient, wife, ex-wife, etc.) reports you in hopes of collecting the reward offered by HHS.
3. An unannounced audit by OCR, the enforcement arm of HHS. OCR is required by Congress to audit and has hired an outside firm to begin auditing in Q4 2010.
4. A state attorney general files suit in federal court, as allowed by the HITECH Act.
5. A patient data breach, which must be reported.

The good news is that just starting on a compliance program earns you a lot of points. Also, new cloud computing solutions are cost-effective and efficient for even the smallest companies. A small company can get started for only $125 and can stay compliant, and prove it, for only $35 per month. This is less than your latte budget.

Tuesday, September 21, 2010

ISO 27001 vs. ISO 27002


Contributed By:
Dejan Kosutic


If you have come across both ISO 27001 and ISO 27002, you probably noticed that ISO 27002 is much more detailed and much more precise - so what is the purpose of ISO 27001, then?

First of all, you cannot get certified against ISO 27002 because it is not a management standard. What does a management standard mean? It means that such a standard defines how to run a system, and in the case of ISO 27001, it defines the information security management system (ISMS) - therefore, certification against ISO 27001 is possible.

This management system means that information security must be planned, implemented, monitored, reviewed, and improved. It means that management has its distinct responsibilities, that objectives must be set, measured and reviewed, that internal audits must be carried out and so on.

All those elements are defined in ISO 27001, but not in ISO 27002.

The controls in ISO 27002 are named the same as in Annex A of ISO 27001 - for instance, in ISO 27002 control 6.1.6 is named Contact with authorities, while in ISO 27001 it is A.6.1.6 Contact with authorities. But, the difference is in the level of detail - on average, ISO 27002 explains one control on one whole page, while ISO 27001 dedicates only one sentence to each control.

Finally, the difference is that ISO 27002 does not make a distinction between controls that are applicable to a particular organization and those that are not. On the other hand, ISO 27001 prescribes a risk assessment to be performed in order to identify, for each control, whether it is required to decrease the risks and, if it is, to what extent it should be applied.
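
To illustrate that relationship, here is a minimal sketch, under assumed, simplified data structures (AnnexAControl, SoAEntry, build_soa are made-up names, not anything prescribed by either standard), of how a Statement of Applicability could be derived from a risk assessment: an Annex A control is marked applicable only if identified risks require it, and ISO 27002 then supplies the implementation guidance for the controls that are applicable.

from dataclasses import dataclass
from typing import Dict, List

# Hypothetical, simplified structures; neither standard prescribes this model.

@dataclass
class AnnexAControl:
    ref: str    # control reference shared by ISO 27001 Annex A and ISO 27002
    name: str

@dataclass
class SoAEntry:
    control: AnnexAControl
    applicable: bool    # decided by the ISO 27001 risk assessment
    justification: str  # why the control is, or is not, needed to reduce risk

def build_soa(controls: List[AnnexAControl],
              risks_by_control: Dict[str, List[str]]) -> List[SoAEntry]:
    """Mark each control applicable only if the risk assessment found risks it treats."""
    soa = []
    for control in controls:
        risks = risks_by_control.get(control.ref, [])
        soa.append(SoAEntry(
            control=control,
            applicable=bool(risks),
            justification=("Treats risks: " + ", ".join(risks)) if risks
                          else "No identified risk requires this control",
        ))
    return soa

# Example with the control mentioned above; ISO 27002 then supplies the detailed
# implementation guidance for any control marked applicable here.
soa = build_soa(
    [AnnexAControl("A.6.1.6", "Contact with authorities")],
    {"A.6.1.6": ["Delayed escalation of incidents to the authorities"]},
)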

The question is: why do those two standards exist separately, and why haven't they been merged to bring together the positive sides of both? The answer is usability - if it were a single standard, it would be too complex and too large for practical use.

Every standard from the ISO 27000 series is designed with a certain focus - if you want to build the foundations of information security in your organization and devise its framework, you should use ISO 27001; if you want to implement controls, you should use ISO 27002; if you want to carry out risk assessment and risk treatment, you should use ISO 27005; and so on.

To conclude, one could say that without the details provided in ISO 27002, controls defined in Annex A of ISO 27001 could not be implemented; however, without the management framework from ISO 27001, ISO 27002 would remain just an isolated effort of a few information security enthusiasts, with no acceptance from the top management and therefore with no real impact on the organization.

Thursday, September 16, 2010

Can A Business Continuity Strategy Save You Money?



Contributed By: Dejan Kosutic



You are thinking about implementing the business continuity management/BS 25999-2 standard? But then you hear it will cost you a lot? It probably will cost you, but not necessarily as much as you thought - this is something you can solve with a good business continuity strategy.

A business continuity strategy, as defined in the BS 25999-2 standard, is an "approach by an organization that will ensure its recovery and continuity in the face of a disaster or other major incident or business disruption".

Therefore, the point is to prepare yourself in the best possible manner to counteract a disaster, should one occur. This preparation can include organizational measures (drawing up plans, making contracts with suppliers/partners, exercising, reviewing, awareness raising, etc.) as well as measures that require investment in equipment, infrastructure, etc.

Time is a very important factor in recovery - if you do not recover your business in time, you will probably lose your customers and consequently lose your business as well. So the business continuity strategy must set the recovery time objective (RTO) for each of your critical activities, and the RTO can be different for each of them.

One important consideration: the shorter the RTO, the bigger the investment you will need. For instance, if you want to recover your data centre in less than one hour, you will have to invest in almost the same equipment at an alternative location as at the primary location; on the other hand, if you want to recover your data centre in two weeks, the investment will be much lower, because it would be enough to store the backup tapes at the alternative location, allowing you two weeks to obtain the necessary equipment. All this means that your RTO must not be too long, but not too short either.
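
To make that trade-off concrete, here is a minimal sketch that, for each critical activity, selects the cheapest recovery option whose recovery time still meets the activity's RTO. The activities, options, and cost figures are invented purely for illustration; they are not taken from BS 25999-2.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class RecoveryOption:
    name: str
    recovery_hours: float  # how quickly this option restores the activity
    annual_cost: float     # rough investment, in arbitrary currency units

@dataclass
class CriticalActivity:
    name: str
    rto_hours: float       # recovery time objective set by the BC strategy

def cheapest_option(activity: CriticalActivity,
                    options: List[RecoveryOption]) -> Optional[RecoveryOption]:
    """Return the lowest-cost option that still meets the activity's RTO, if any."""
    feasible = [o for o in options if o.recovery_hours <= activity.rto_hours]
    return min(feasible, key=lambda o: o.annual_cost) if feasible else None

# Illustrative figures only: a hot standby data centre recovers in an hour but costs
# a lot; restoring from off-site backup tapes takes two weeks but costs very little.
options = [
    RecoveryOption("hot standby data centre", recovery_hours=1, annual_cost=500_000),
    RecoveryOption("rented recovery site", recovery_hours=72, annual_cost=80_000),
    RecoveryOption("off-site backup tapes", recovery_hours=14 * 24, annual_cost=10_000),
]

for activity in (CriticalActivity("order processing", rto_hours=24),
                 CriticalActivity("monthly reporting", rto_hours=14 * 24)):
    choice = cheapest_option(activity, options)
    print(activity.name, "->", choice.name if choice else "no feasible option")

The shape of the result is the point: the tighter the RTO, the fewer options qualify and the more the cheapest qualifying option costs, which is exactly why the RTO should be set neither too long nor too short.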

Once the RTO is set, you will still need to make some investment; however, with a good business continuity strategy you will be able to decrease that investment, while still being able to recover your critical activities within the recovery time objective. Here are some examples:

  • you might not need your own data centre at an alternative location - in most countries you can rent such a location from a specialized company, which means you don't need to invest in infrastructure, and maybe not even in equipment or software;
  • you might not need offices at an alternative location - employees who do not have to meet customers face-to-face can work from their homes;
  • you might not need an alternative location at all if you have other business units at different locations which could take over the critical activities affected by the disaster;
  • you might not need to purchase equipment in advance if you can find a supplier that can guarantee delivery of the equipment within your RTO.

In all these examples you will need to increase your organizational capabilities, but if you want to save some money, it is certainly something worth thinking about.


Cross posted from ISO 27001 & BS 25999 blog - http://blog.iso27001standard.com