Server Virtualization in Context

by Dale Vile and Jon Collins

One of the hottest topics in the IT industry at the moment is virtualization, particularly in relation to the x86 servers that have so often proliferated to unmanageable levels. While the principle of server consolidation based on the latest virtualization technologies is now accepted, how far have organizations progressed in this area? And based on adoption experiences, what are the practical considerations when dealing with server proliferation?

KEY FINDINGS

Wake-up call: organizations generally have more physical servers than applications
Feedback from a recent study suggests that the IT infrastructure in larger organizations is often supporting several hundred, if not a thousand or more, applications, with even smaller businesses supporting software portfolios in the 10 to 50 application range. While this may sound familiar, the wake-up call is that applications are generally outnumbered by the physical servers on which they run. As a result, 85% of respondents highlight existing or emerging issues with server proliferation.

Server proliferation is a function of cultural as well as technical factors
Historically, new applications have been installed on their own dedicated hardware, regardless of whether the full capacity of a server is required – this avoids conflict with other applications, and enables each box to be tuned to run an application in an optimum manner. However, the dedicated server approach reinforces the (administrative and political) expectation of business stakeholders owning everything associated with the applications they fund, with the server and other equipment allocated to their own cost centre.

The consequences of server proliferation are real, but can be tackled
Server sprawl has a direct, negative impact on routine activities such as patch management, application provisioning, and general monitoring and management of performance. This has a knock-on effect with regard to operational overheads and associated costs. Server proliferation also goes hand in hand with poor server utilization and power and space related challenges, which not only translate to elevated costs, but can also constrain development and growth. Those who have server proliferation under control demonstrably suffer significantly fewer problems in all of these areas.

Virtualization technologies are key to driving improvements
Quantitative and anecdotal evidence suggests that there are clear and tangible benefits to be gained from implementing virtualization technology to consolidate and rationalize x86 server estates, and mainstream experience is accumulating rapidly. With the solution landscape still developing, however, it is important to monitor how offerings are evolving in terms of pricing, bundling and capability – an offering that looked current a year ago might not look so today.

Adoption experiences highlight the importance of forward planning
When adopting any new technology, it is important to ensure that new problems are not being created for the future – for the unprepared, unwanted proliferation of physical servers can all too easily be replaced by virtual server sprawl. Understanding implementation and management best practice, and planning accordingly, will reduce the risks and enhance the returns from your virtualization activity.

The study upon which this report is based was independently designed by Freeform Dynamics and executed in collaboration with The Register news site. Feedback was gathered via an online survey of 301 IT professionals from the UK, USA, and other geographies, and an interactive ‘reader workshop’. The study was sponsored by Microsoft.


