Delivering a secure information infrastructure

I recently had the task of writing an explanatory paper about Good Practice Guide (GPG) 13, a piece of UK government-sponsored guidance on “protective monitoring” – that is, keeping an eye on what’s going on in your IT environment in order to spot when security breaches happen.

Now, before you get all Big Brother about it, it was more about the very boring technical stuff – looking for unauthorised access to files, failed login attempts and so on. Though I would imagine that if an organisation wanted to breach staff privacy and compromise its own ethics, all manner of other good advice probably exists.
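To make that concrete, here is a minimal sketch of the sort of thing protective monitoring boils down to at its simplest: tallying failed login attempts from a syslog-style auth log and flagging sources that cross a threshold. To be clear, this is purely illustrative – the log path, the OpenSSH message format and the five-attempt threshold are my own assumptions, not anything GPG 13 prescribes.

```python
import re
from collections import Counter

# Illustrative protective-monitoring sketch: count failed SSH logins
# per source IP and flag repeat offenders. The log path, message
# format and threshold below are assumptions, not GPG 13 requirements.
LOG_PATH = "/var/log/auth.log"   # assumed location; varies by distribution
THRESHOLD = 5                    # illustrative alerting threshold

# Matches OpenSSH lines such as:
#   "Failed password for invalid user admin from 203.0.113.7 port 22 ssh2"
failed_login = re.compile(r"Failed password for (?:invalid user )?\S+ from (\S+)")

attempts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = failed_login.search(line)
        if match:
            attempts[match.group(1)] += 1

for source_ip, count in attempts.most_common():
    if count >= THRESHOLD:
        print(f"ALERT: {count} failed logins from {source_ip}")
```

Real monitoring tools add correlation, retention and alerting on top, of course, but the underlying principle – watch the logs, spot the anomalies – is no more complicated than this.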

GPG 13 is nothing if not comprehensive, covering every aspect of how IT systems monitoring should take place. It checks all the right boxes in terms of people, process and technology. I have no doubt that an inordinate amount of effort has gone into ensuring its consistency, not only within the document, but also with other, equally comprehensive government guidance documents.

Indeed, there appears to be only one downside to this carefully crafted tome: it would be completely unworkable in practice for the majority of organisations we research. The fact is that most businesses struggle to put security in place, for a variety of reasons, not least the downright complexity of today’s IT environments.

While we can see that the traditional view of security as “somebody else’s problem” is gradually fading, that doesn’t mean security itself is becoming any easier to administer.

In fact, the evidence suggests the opposite. Among the top issues cited in our recent poll were data growth and the continuing shift towards more distributed working practices. Right now, and despite the best efforts of security vendors, these areas cause issues that remain beyond the ability of many organisations to solve from a security perspective – if, indeed, “solve” is the right word, given just how nebulous this whole area can be.

You could argue that this just means we all need to work harder and enforce policies more strongly. Indeed, we’ve had plenty of feedback from the Reg audience suggesting it is possible to lock down specific IT environments. The tools are there, you tell us; they just need to be implemented in the right way. But as with the fairground game, for every rat we manage to bang back into its hole, another one pops up almost immediately. And the rats are getting smaller and faster-moving, if the latest smartphone trends are anything to go by.

Meanwhile, of course, the networks of tunnels they occupy are also becoming more complex. When my colleague Dale Vile conducted some recent research into exactly what cloud computing is, he found that hosted storage was seen as a valid option, even though numerous questions around data security and privacy remain to be answered.

The point is not that these things are happening – you know that for yourselves. It’s more that we are still driven towards traditional approaches to securing information which, with the best will in the world, don’t stand a chance of working in practice. IT security relies on building frameworks with all the strengths and weaknesses of a house of cards. Build it once and leave it well alone, and it will remain standing for a goodly while. Try to change any one part of it, however, and the whole thing comes tumbling down and needs to be rebuilt from scratch.

Change comes from a variety of directions, not least from the top. You can build a fortress, but all it takes is one senior exec to want a nice window in the outer walls and all that effort goes to worms. This is exactly what happens in many organisations, and frankly, there may be sound business reasons for making such changes or introducing new risk. If, say, the best way of broadcasting a service is deemed to be through social networking, or if the top dog needs a set of files for a client visit and wants to access them from a laptop, then so be it. From a business risk perspective, the potential for security breaches needs to be weighed against the potential cost of not doing business at all – and on a day-to-day basis.

I’m not saying that acceptable levels of IT security are impossible to achieve. It’s just that inch-thick best practice frameworks are never going to be an appropriate mechanism for dealing with areas of rapid change. Of course, it’s easy for someone like me (who doesn’t actually have to do anything about it) to bang on about better ways of doing things, as if I had all the answers. The truth is I don’t – but I do know we should be asking some pretty fundamental questions about how we approach security in the first place.

Returning to the rat analogy (and avoiding the temptation to mix metaphors and consider the lovely nest a rat could make out of a house of cards), it may be that we can learn from areas such as pest control. Nobody actually expects to wipe rats off the face of the planet, even if that were a good idea. However, a combination of good hygiene, containment and well-documented procedures for dealing with the occasional infestation might offer a better approach than trying to seal all the holes, as we currently do with IT security.

Whatever happens in IT over the next few years, what is clear is that things are not going to get any simpler. Perhaps it is time to move from onerous good-practice guides to a far clearer understanding of achievable, workable and sustainable best practice. This doesn’t mean we should throw away our door and window locks – these fit in the category of “basic hygiene”. However, just as we can’t disinfect every surface, neither should we be trying to restrict every movement for fear something might go wrong. If you have any advice in this area, do say.

Content Contributors: Jon Collins
