It goes without saying that IT systems should, in principle, be secured so that only properly authorised users can access data, applications or services.
Most companies are pretty good at giving authorised users access to the systems and data they need to do their jobs. The trouble is that this is only half the job. It is just as important, if not more so, to keep unauthorised users from accessing systems or data, or to keep authorised users from getting a little too creative with their use of the data.
Most companies control access to IT systems and services with identity and access management (IAM) products. These range from simple directory services, with file access granted on the basis of security access tokens, to more advanced single sign-on capabilities that enable seamless access across multiple business systems. Such products may be effective in a controlled environment, but in the complex reality of most organisations a number of problems arise. One of the biggest is that IAM products alone cannot sufficiently protect information and data from unauthorised access; additional tools are needed to achieve this.
Part of the problem is that files and file systems have evolved around the notion that access can be completely controlled through a combination of the authentication provided by directory services and the security capabilities of the file system, such as NTFS. In this world, the data is only protected for as long as the hardware is physically secure: anybody with physical access to a server could remove its disks and install them in a different computer to gain access to the files and data on them.
This approach may have worked in the days of physical servers that were under lock and key, and with desktop PCs that never left the building. But the reality today is that notebooks are constantly on the move beyond the network perimeter, servers may be stolen along with their disks, or disks may even be sent for disposal without the data on them being securely deleted.
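To illustrate the point, the short sketch below (written in Python purely for illustration, with a hypothetical file name) shows that file permissions are just metadata enforced by the running operating system, not a property of the data itself.

```python
import os
import stat

# Create a file and restrict it so that only the owner can read or
# write it. These permission bits are metadata interpreted by *this*
# operating system; they do not alter the bytes stored on the disk.
path = "payroll.csv"  # hypothetical sensitive file
with open(path, "w") as f:
    f.write("employee,salary\nalice,50000\n")

os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)  # 0o600: owner-only access
print(oct(os.stat(path).st_mode & 0o777))    # -> 0o600

# On this machine other users are now denied access. But move the disk
# into a computer the attacker controls, and the attacker's operating
# system can simply ignore or reset these bits and read the plaintext,
# which is exactly the gap described above.
```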
The simple fact is that relying on file system security for protecting access to data is not going to be enough. An updated approach is required to ensure that the integrity of the company’s information is protected wherever it may be residing or however it is being used. We can break this down into two areas of focus. The first is protecting the data in its raw form as it rests in the file system or flows through the network. The second is controlling how the data or information is used once authorised users have gained access to it.
Ensuring the confidentiality of data in its raw form is most effectively achieved using encryption. Despite the maturity of encryption products, uptake has been sluggish, with investment spurred mainly by expensive leaks or by legislation and compliance initiatives. This has resulted in encryption mainly being deployed at trouble 'hot spots', such as on notebook PCs or backups, to protect information should it be lost or stolen.
While this is a beginning – after all something is better than nothing – a more comprehensive approach will be required to cover the encryption of data in all locations, even servers in the data centre. This will become more and more relevant as new approaches to computing, such as Cloud, become more widely accepted and used.
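As a simple illustration of what protecting data in its raw form involves, the sketch below uses Python's third-party cryptography library (one possible choice among many; the file names are hypothetical) to encrypt a file before it ever rests on disk in the clear.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# In practice the key would come from a key management system rather
# than being generated next to the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

with open("payroll.csv", "rb") as f:       # hypothetical sensitive file
    plaintext = f.read()

# Encrypt before the data rests on disk: whoever steals the disk or
# the backup tape sees only ciphertext.
with open("payroll.csv.enc", "wb") as f:
    f.write(cipher.encrypt(plaintext))

# Access now depends on holding the key, not on file system permissions.
with open("payroll.csv.enc", "rb") as f:
    assert cipher.decrypt(f.read()) == plaintext
```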
Encryption is no easy ride, however. It could be implemented in a piecemeal fashion, but the danger is that compatibility and operational costs (for example, around managing encryption keys) become just as big an issue over the long term as leaving the data unprotected. It makes sense to investigate how encryption could be implemented across the business, if only to understand how the associated complexities might be addressed.
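One pattern often used to keep the key management burden tractable is envelope encryption: each item is encrypted under its own data key, and only a master key needs long-term protection. The sketch below, again using the cryptography library, is illustrative only, with hypothetical function names.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# The master key (the key-encrypting key) is the only long-term secret,
# typically held in an HSM or a key management service.
master_key = Fernet.generate_key()
kek = Fernet(master_key)

def encrypt_record(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt data under a fresh per-item key; return (wrapped_key, ciphertext)."""
    data_key = Fernet.generate_key()
    ciphertext = Fernet(data_key).encrypt(plaintext)
    wrapped_key = kek.encrypt(data_key)   # stored alongside the ciphertext
    return wrapped_key, ciphertext

def decrypt_record(wrapped_key: bytes, ciphertext: bytes) -> bytes:
    data_key = kek.decrypt(wrapped_key)   # requires access to the master key
    return Fernet(data_key).decrypt(ciphertext)

wrapped, ct = encrypt_record(b"employee,salary\nalice,50000\n")
assert decrypt_record(wrapped, ct) == b"employee,salary\nalice,50000\n"
```

Rotating the master key then means re-wrapping the small data keys rather than re-encrypting every file, which is exactly the sort of operational question worth understanding up front.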
Protecting data in use is just as important. To bring the point home, it only takes one rogue employee with legitimate access to data deciding to profit from it. Some checks on inappropriate use can be made through logs and audit trails, but these are often too little, too late.
The key to protecting data in use is to understand both the data that is being accessed, and the context in which it is being used. While this may seem like a pipe dream, tools and technologies are being developed to try to bring some order to this chaos.
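As a taste of what understanding context might look like, the toy policy check below considers not just who the user is, but what device they are on and how much data they are asking for. Every rule and name here is a hypothetical illustration, not a description of any particular product.

```python
from dataclasses import dataclass

@dataclass
class AccessContext:
    user: str
    role: str
    device_managed: bool        # is this a company-managed device?
    on_corporate_network: bool
    records_requested: int

def allow_access(ctx: AccessContext) -> bool:
    """Toy context-aware policy: identity alone is not enough."""
    if ctx.role != "hr":                  # is the user authorised at all?
        return False
    if not ctx.device_managed:            # device context
        return False
    if not ctx.on_corporate_network and ctx.records_requested > 10:
        return False                      # bulk export only from inside
    return True

# A legitimate HR user pulling the whole database from an unmanaged home
# PC is refused, even though a pure IAM check would have let them in.
print(allow_access(AccessContext("alice", "hr", False, False, 50000)))  # False
print(allow_access(AccessContext("alice", "hr", True, True, 3)))        # True
```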
Data classification systems often run in the data centre to identify information that is sensitive or poses a risk to the business, while data leakage prevention systems look to prevent the dissemination of sensitive information to unauthorised recipients. The difficulty is that these tools are currently fragmented and still maturing. At this stage it makes sense to implement them based on specific pain points, but in the longer term the value will lie in bringing them together.
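To give a flavour of what such tools do under the hood, the sketch below scans outbound text for strings that look like payment card numbers, combining a pattern match with a Luhn checksum to weed out false positives. Real data leakage prevention products go far beyond this, so treat it purely as an illustration of the principle.

```python
import re

CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Luhn checksum used by payment card numbers."""
    digits = [int(d) for d in reversed(number)]
    total = sum(digits[0::2])
    for d in digits[1::2]:
        total += d * 2 - 9 if d * 2 > 9 else d * 2
    return total % 10 == 0

def find_card_numbers(text: str) -> list[str]:
    """Return candidate card numbers found in outbound text."""
    hits = []
    for match in CARD_PATTERN.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 16 and luhn_valid(digits):
            hits.append(digits)
    return hits

outbound = "Please charge 4111 1111 1111 1111 and confirm by email."
print(find_card_numbers(outbound))  # ['4111111111111111'] -> block or flag
```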
In this mobile and virtual age in which we now live, it is clear that doing the same as in the past is no longer an option. Whichever approach you take to data protection, be it comprehensive encryption, data leakage prevention or just tackling problem hot spots, it will be a step in the right direction, as long as you take the wider operational questions into account.
Content Contributors: Andrew Buss