Paul Ayers, VP EMEA at Vormetric
Financial services organisations and banks operate across a vast tapestry of file servers, databases and operating systems. While rapid information flow is vital to the operation of these businesses, safeguarding sensitive information like payment card data in a manner that is compliant with the Payment Card Industry Data Security Standard (PCI DSS) presents a significant challenge. Conventional network perimeter controls alone do not provide sufficient protection and frequently fall short of increasingly stringent compliance mandates. By implementing granular encryption right down at file level, financial institutions can not only meet many PCI DSS directives in a cost-effective and transparent manner, but also achieve optimal security amid escalating cybercrime risks.
From the outset, securing heterogeneous IT environments can be a complex process, involving painstaking administration and cooperation between a variety of stakeholders within the organisation. While IT and information security teams must counter changing security threats and may want to limit data access, business teams need information access to gain competitive advantage. From a technical standpoint, organisations endeavouring to comply with the PCI DSS technical stipulations also face a number of practical challenges, from re-architecting existing networks, to updating software, to writing new maintenance policies. Many organisations are likely to struggle to find a balance when it comes to choosing a security solution capable of taking account of all these variables.
PCI DSS directives require that any organisation handling payment data fulfil a customary checklist: managing access control, encryption, key management, and auditing of cardholder data at rest. While PCI DSS is the impetus for many payments-centric companies to encrypt sensitive data, there are other pressing issues driving companies to reassess their security practices. The steady drumbeat of data breaches in 2012 showed us that data is under siege – with threats ranging from Advanced Persistent Threats, to outside hackers, to malicious insiders seeking to hijack and steal information. In addition, with increasingly stringent data protection directives on the horizon, regulators and lawmakers are embarking on designing refined legal frameworks and making plain the obligations for businesses to have adequate and concrete data security measures in place.
As the security and compliance landscape has evolved over time, organisations have typically adapted their systems and bolted on data security measures in an ‘after-the-fact’ endeavour. Unfortunately, such approaches typically result in a hodgepodge of encryption and key management islands that do not lend themselves to consistent data protection policies and lead to increased management overhead, and therefore operating cost. Some databases may have integrated encryption, but that encryption only applies to a single database vendor and typically only the most recent versions of the software. This approach does not work in a heterogeneous environment that includes structured database information along with unstructured information outside of the database, such as log files, extract-transform-load files, spreadsheets and other document files. Managing the security for both structured and unstructured data sets poses a challenge for most modern organisations. Caught between a rock and a hard place, they need to rethink how to consistently secure these various operating environments all with an eye to transparency, security, performance, and ease-of-use. Rather than looking at security to protect data at certain points, organisations need to implement data-centric protection measures, capable of locking down data regardless of where it resides, placing controls much closer to the data itself.
PCI requirements in the spotlight
According to requirement 3 of PCI DSS, entitled ‘Protect Stored Data’, data should be rendered ‘unreadable’ – the guidelines provide a number of methods by which that might be achieved. Among these methods are one-way hashes, truncation, tokenisation and strong cryptography. Therefore, if an intruder circumvents other security controls and gains access to encrypted data, without the proper cryptographic keys the data is rendered unintelligible in the hands of anyone not authorised to access it. Furthermore, requirement 3.4, which is increasingly pertinent to developments in today’s sprawling operating environment, states that data must be made unintelligible ‘anywhere it is stored’. As an increasing number of institutions migrate to the cloud for operational flexibility and cost savings, adhering to this seemingly simple mandate can quickly become quite complicated.
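To make the distinction between these methods concrete, the sketch below applies a salted one-way hash, truncation and tokenisation to a sample (test) card number using only Python's standard library. It is an illustration of the concepts only, not a PCI-validated implementation; the variable names and the token-vault dictionary are purely for demonstration.

```python
import hashlib
import secrets

PAN = "4111111111111111"  # a well-known test card number, not real data

# One-way hash: irreversible; a random salt is added so identical PANs
# do not produce identical digests (resists rainbow-table lookups)
salt = secrets.token_bytes(16)
hashed = hashlib.sha256(salt + PAN.encode()).hexdigest()

# Truncation: retain at most the first six and last four digits;
# the middle digits are discarded entirely and cannot be recovered
truncated = PAN[:6] + "*" * (len(PAN) - 10) + PAN[-4:]

# Tokenisation: replace the PAN with a random surrogate value, keeping
# the real value only in a tightly controlled token vault
vault = {}
token = secrets.token_hex(8)
vault[token] = PAN

print(truncated)  # the stored/displayed form never exposes the full PAN
```

Strong cryptography (the fourth method) differs from all three in that it is reversible by design, which is precisely why the standard insists the decryption keys be managed and segregated with such care.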
Encryption alone, however, is insufficient to provide the granular control required by the PCI DSS. Arguably, encryption is only as strong as the associated key management and access controls that accompany it. Key management operates to protect against the untoward activities of rogue insiders and is of utmost importance when complying with Requirement 7 of PCI DSS. This condition dictates that access to system components and cardholder data be limited to only those individuals whose job requires such access. Furthermore, it prescribes that access privileges are to be assigned to individuals based on their explicit job classification or function.
The access control functionality supported by a key management policy means that only those who hold keys to particular data sets are authorised to view them. However, special attention needs to be given to the role of ‘privileged users’ such as admin officials who frequently have unlimited or unnecessary access to company resources. Take, for example, the office IT administrator, who should only have authorisation to back up and restore files, while the application developer can be given the privilege to manipulate data. Considering a ‘five-factor’ access control system, comprising who, what, where, when, and why, allows organisations to enable context-aware access control and assign the appropriate level of access from there.
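A five-factor policy check of this kind can be sketched as a simple rule evaluation. The roles, hostnames and business-hours window below are hypothetical examples, not part of any particular product; the point is that every factor – who, what, where, when, and why – must pass before access is granted.

```python
from dataclasses import dataclass

@dataclass
class Request:
    role: str       # who: the requester's job function
    operation: str  # what: the action being attempted
    host: str       # where: the machine the request originates from
    hour: int       # when: hour of day, 0-23
    ticket: str     # why: a change/incident reference justifying access

def is_allowed(req: Request) -> bool:
    # Who/what: the IT administrator may only back up and restore files,
    # never read cleartext data; the developer may manipulate data
    permitted_ops = {
        "it_admin": {"backup", "restore"},
        "app_developer": {"read", "write"},
    }
    if req.operation not in permitted_ops.get(req.role, set()):
        return False
    # Where: only approved application servers (example hostnames)
    if req.host not in {"app-server-01", "app-server-02"}:
        return False
    # When: business hours only
    if not 8 <= req.hour < 18:
        return False
    # Why: a business justification must be supplied
    return bool(req.ticket)
```

An administrator backing up files from an approved server during working hours with a valid ticket passes all five checks; the same administrator attempting to read data, or acting at midnight, is refused.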
PCI DSS further requires that organisations track access to cardholder data, and to the systems and resources that can access it. According to the documentation, the ability to track these activities is “critical in preventing, detecting, or minimising the impact of a data compromise.” Such an audit capability can also serve to monitor and trace the activity of employees as they access resources and, in doing so, can highlight any suspect deviations in user behaviour.
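The shape of such an audit trail can be illustrated with a minimal sketch: each access to a cardholder-data resource is recorded as a timestamped event, and a trivial per-user read count stands in for the far richer behavioural analysis a production system would apply. The field names and threshold are assumptions for illustration, and a real store would be append-only and tamper-evident.

```python
from datetime import datetime, timezone

audit_log = []  # in production: an append-only, tamper-evident store

def record_access(user: str, resource: str, action: str) -> dict:
    # Capture who touched which cardholder-data resource, when, and how
    event = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "resource": resource,
        "action": action,
    }
    audit_log.append(event)
    return event

def flag_deviations(threshold: int = 100) -> list:
    # Flag users whose read volume exceeds a simple per-user baseline --
    # a stand-in for the behavioural analysis the standard calls for
    counts: dict = {}
    for event in audit_log:
        if event["action"] == "read":
            counts[event["user"]] = counts.get(event["user"], 0) + 1
    return sorted(user for user, n in counts.items() if n > threshold)
```

A user who suddenly reads far more records than their baseline would surface in the flagged list, giving security teams an early signal of the suspect deviations described above.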
Approaching security and compliance as two sides of the same coin in this way, means financial institutions can conduct their business safe in the knowledge that they have met the core requirements of this primary mandate and that data is protected no matter where it resides.
Paul Ayers, VP EMEA, Vormetric. Ayers leads Vormetric’s channel programme in EMEA. He was previously sales director for PGP Europe and senior sales director for Northern Europe for PGP Corporation until its acquisition by Symantec.
Vormetric (@Vormetric) is the leader in enterprise encryption. The Vormetric Data Security product line provides a single, manageable and scalable solution to manage any key and encrypt any file, any database, any application, anywhere it resides – without sacrificing application performance or introducing key management complexity. Some of the largest and most security-conscious organisations and government agencies in the world, including 16 of the Fortune 25, have standardised on Vormetric to provide strong, easily manageable data security. Vormetric technology has previously been selected by IBM as the database encryption solution for DB2 and Informix on Linux, Unix and Windows; and by Symantec to provide the Symantec Veritas NetBackup Media Server Encryption Option. For more information, visit www.vormetric.com