With the ratification of Payment Card Industry Data Security Standard (PCI DSS) version 3.1, passing PCI audits just became a little bit harder.
But can you blame the PCI Security Council for updating the standard so quickly? With all of the recent breaches in the retail sector, an update was bound to be forthcoming.
This won’t affect you, though, because all of your credit-card-processing applications are up to snuff and meet the requirements outlined in the PCI spec, right?
What about those mainframe applications? The workhorse applications that quietly run the global economy behind the scenes (utilities, point of sale) and, for some reason, are the last to be considered when it comes to security.
A little background on PCI may be in order here. If you work in IT but have never had to ensure your systems are PCI compliant: PCI DSS is a standard that companies must follow to protect credit card data stored on their systems and transmitted over their networks.
Credit card data is stored and processed on virtually every platform being used for business in the enterprise, including IBM mainframes and a wide range of systems in the retail, financial, insurance, and healthcare markets.
Organizations that don’t properly protect this sensitive data end up in the (bad) news and endure hefty fines when they fail audits. Don’t be one of them.
Consider the key change in the recent PCI 3.1 update in how encryption is addressed.
At the time PCI DSS version 3.0 was published, it recommended TLS v1.1 or later for encrypting data. Then along came several SSL/TLS vulnerabilities, such as Heartbleed (an implementation bug in OpenSSL) and POODLE (a protocol flaw in SSL 3.0), which made that recommendation obsolete.
Hence the quick update and ratification of PCI DSS version 3.1, which recognizes that encryption security is a moving target and therefore states the requirement less explicitly: “… new implementations must not use SSL or early TLS.”
So what is “early” TLS? The Council’s guidance points at TLS 1.0 today (and, in some implementations, 1.1). But since those versions were current not that long ago, it only makes sense that even TLS 1.2 will be considered “early TLS” not too far down the road.
How far down the road? It’s difficult to know for sure. But if the past is any indicator, 1.2 probably has another 6-12 months before a new version of the standard emerges that plugs the latest holes in 1.2.
The first step in securing mainframe (and other host) applications, then, is implementing encryption properly, followed by a plan for keeping it updated as new vulnerabilities surface.
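What that first step looks like in code depends on your stack, but the principle is the same everywhere: refuse SSL and early TLS at the connection layer rather than trusting defaults. As a minimal sketch (in Python, purely for illustration; the function name is my own), a client-side TLS context can be pinned to TLS 1.2 as its floor:

```python
import ssl

def make_strict_tls_context() -> ssl.SSLContext:
    """Build a client TLS context that refuses SSL and early TLS."""
    ctx = ssl.create_default_context()
    # Setting a minimum version disables SSLv2/v3 and TLS 1.0/1.1
    # in one step (Python 3.7+, OpenSSL 1.1.0+).
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

ctx = make_strict_tls_context()
print(ctx.minimum_version)  # TLSVersion.TLSv1_2
```

Keeping this floor in one place also makes the “plan for keeping it updated” concrete: when TLS 1.2 eventually joins the “early” list, it is a one-line change.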
Large organizations with hundreds or thousands of workstations to update to the new standard should get started now to ensure they can first come up with a risk mitigation plan by the June 2016 cutoff date and then get busy on the actual migration to meet the new requirement. With the security landscape continuing to evolve, further updates to the standard are surely on the way.
Ambitious IT organizations in the retail, financial, or healthcare sectors that run business applications on host systems can take it up a notch with additional safeguards, such as redacting credit card data on IBM host screens and managing host access centrally.
Micro Focus bridges the old and the new, enabling customers to unlock the value in their core business applications. To learn more about securing the last mile of enterprise applications, visit www.attachmate.com.