Virtualisation has come to the fore largely through the seemingly insatiable demand for x86 server consolidation, but as we know from this lab and elsewhere, its appeal is broader than that. Storage virtualisation has a longer heritage as a technology, for example, and all the signs are that server virtualisation and storage virtualisation need to work in tandem to get the best out of each, although we know that understanding of the latter lags well behind.
Meanwhile we have desktop virtualisation, which is more of an umbrella term for a range of technologies – application and graphics streaming, client-side hypervisors and so on. Server virtualisation options are also moving beyond the hypervisor model, and no doubt we shall see application streaming becoming an option on servers as well as desktops, for example.
Virtualisation has already emerged in other areas of IT. Virtual machines can be run on mobile devices, and application server software (such as that from Oracle/BEA) has also offered virtual environments for its own workloads. Another area of potential is the embedded systems space.
In other words, virtualisation is only going to become more prevalent and more complicated, until it touches every area of IT. There will be more options for more scenarios across and between more platforms. From that point of view it will continue to be a topic of some interest. But at the same time, if virtualisation really does become part of absolutely everything, it may become so commonplace that it barely merits mentioning as a separate entity.
“Oh, but hang on,” says a little voice at the back of my head, taking me right back to Winkel and Prosser’s Art of Digital Design. “IT isn’t real anyway, it’s all about electronic signals, right?” The little voice has a point – indeed, IT has long been about how good we are at abstracting computational tasks and data movements, from the physical hardware required to do the job. Mainframes got in early with virtualisation of course, and virtual memory has been a necessity ever since Bill Gates didn’t say “640K is enough for anybody.” Oh, and when was the last time anyone directly accessed bits in a storage system anyway?
So, if IT has always been about abstraction, it makes sense that even as we do more virtualisation, we’re going to be talking about it less. Abstraction is a means to an end – it only makes sense to package things in a way that supports the information and services to be delivered. A philosophical point perhaps, but one which gives us the fundamentals of cohesion and coupling that should still be the mainstay of good software development practice. It also provides the basis for best practice around SOA and business service management.
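For readers who want the cohesion and coupling point made concrete, here is a minimal, purely illustrative Python sketch (all names are hypothetical, not drawn from any particular product): the report service depends only on a storage abstraction rather than on where the bytes physically live – the same decoupling, in miniature, that virtualisation applies to hardware.

```python
from abc import ABC, abstractmethod


class DocumentStore(ABC):
    """Cohesive interface: one responsibility, storing and fetching documents."""

    @abstractmethod
    def get(self, doc_id: str) -> bytes: ...

    @abstractmethod
    def put(self, doc_id: str, data: bytes) -> None: ...


class InMemoryStore(DocumentStore):
    """One concrete backing; could equally be a SAN volume or a cloud bucket."""

    def __init__(self) -> None:
        self._docs: dict[str, bytes] = {}

    def get(self, doc_id: str) -> bytes:
        return self._docs[doc_id]

    def put(self, doc_id: str, data: bytes) -> None:
        self._docs[doc_id] = data


class ReportService:
    """Loosely coupled: knows only the DocumentStore abstraction, not the hardware."""

    def __init__(self, store: DocumentStore) -> None:
        self._store = store

    def publish(self, doc_id: str, text: str) -> None:
        self._store.put(doc_id, text.encode("utf-8"))


if __name__ == "__main__":
    # Swapping the backing store never touches ReportService.
    service = ReportService(InMemoryStore())
    service.publish("q3-summary", "Virtualisation adoption continues to grow.")
```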
Ultimately, virtualisation gives us the opportunity to think about what IT does, in terms of workloads, information and service delivery, without spending as much time worrying about what IT is in hardware terms. We know from workshop feedback that these are early days, and it would be premature to ignore the very real demands of hardware in terms of RAM or network bandwidth, for example.