That thorny question of software licences
The principle of server virtualisation is now pretty well accepted in the mainstream IT professional community, but the many conversations Andy Buss, Tony Lock and I have had with IT professionals recently suggest that software licensing can still be a problem.
On quite a few occasions, we have each heard stories about consolidation initiatives that have got bogged down in wrangles about the commercial and legal aspects of software deployment.
While most of the experience gained so far comes from consolidation projects, if we think beyond these to the creation of more dynamic virtualised environments, and ultimately to private cloud, the difficulties are likely to become even more pronounced.
After so many years of physical and virtual server hosting being an integral part of mainstream IT, it’s telling that many are still struggling to work out what’s legal and most cost-effective in this context.
When you analyse things, based on all of the input we have received, two basic problems exist:
1 The undermining of the link between value and cost when virtualisation technology is used to partition servers.
This arises because virtual servers generally use only a subset of the physical server resources available underneath. Sometimes this is implicit, as applications compete for shared resources; at other times it’s more explicit, as when a virtual machine (VM) is only allocated access to two CPUs in a four-way box.
The emergence of multi-core and multi-threaded processors, and the option to allocate fractions of CPUs or cores to VMs, then complicate things further.
Some vendors have ignored all this and insist on charging a licence fee based on the full physical spec of the host machine. This approach is now less common than it used to be, but where it is applied it can restrict flexibility and/or lead to quite punitive costs that many consider unfair (the sketch after these two points illustrates the difference).
Other vendors have attempted to deal with the issue, but often in a manner that is difficult to understand or administer, typically employing approaches that err on the side of their own interests rather than their customers’.
2 The flexibility that stems from decoupling the software layer from the physical server.
In a virtualised environment, applications are no longer tied to a specific machine, so the hardware underpinning a software installation is no longer necessarily persistent. Even in a manually administered setup, the ease with which virtual servers can be relocated between physical boxes means operations staff have a lot of freedom to optimise resource usage.
They are increasingly taking advantage of this as requirements evolve and hardware is naturally refreshed. But much of this activity is not catered for by traditional software licence terms.
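To make the first point concrete, here is a minimal sketch in Python, using entirely hypothetical core counts and per-core prices, of how the same workload can attract very different licence costs depending on whether the vendor counts the full physical host or only the vCPUs actually allocated to the VM, and of how host-based metrics can compound once a VM is free to move between physical boxes.

```python
# Hypothetical illustration only: figures and pricing models are invented
# to show the shape of the problem, not any specific vendor's terms.

PRICE_PER_CORE = 1000  # assumed licence cost per core/CPU, per year


def cost_per_physical_host(host_cores: int) -> int:
    """Vendor charges for every core in the physical host,
    regardless of how many the VM is actually allocated."""
    return host_cores * PRICE_PER_CORE


def cost_per_allocated_vcpu(vcpus: int) -> int:
    """Vendor charges only for the vCPUs allocated to the VM."""
    return vcpus * PRICE_PER_CORE


# A 2-vCPU VM running on a 16-core host:
print(cost_per_physical_host(16))   # 16000 -- pay for the whole box
print(cost_per_allocated_vcpu(2))   #  2000 -- pay for what the VM can use

# If the VM is free to migrate across a cluster of four such hosts,
# a strict per-physical-host reading may require licensing all of them:
cluster_host_cores = [16, 16, 16, 16]
print(sum(cost_per_physical_host(c) for c in cluster_host_cores))  # 64000
```

The gap between the two figures is the "value versus cost" disconnect in a nutshell, and the cluster case shows why the freedom to move workloads around can quietly multiply exposure under host-based terms.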
Whether it’s compliance, cost or both you are concerned with, these kinds of problems are primarily associated with licences for systems software (operating systems, database management systems) or application software that have historically been based on the attributes of the chunk of metal used to run them.
Such challenges will obviously increase as assisted or automated provisioning and workload management become more prevalent.
And if we take the promise of cloud computing literally, we will ultimately be dealing with a continuous process of workload provisioning, re-provisioning and de-provisioning, sometimes even across the on-site/hosted boundary. It’s therefore ironic that some of the vendors at the forefront of advocating the cloud nirvana are arguably the least prepared for dealing with it from a licensing point of view.
The upshot is that wherever you are on the trail of more flexible and dynamic IT delivery, your IT department should probably be prepared for some ‘interesting’ discussions with software vendors in the future.
These will by definition have to go beyond the usual periodic haggling on price that is a natural part of the contract review and renewal process. The fundamental basis upon which software is licensed is being challenged.
Unless you do your homework and prepare well, some vendors, particularly those with a reputation for exploiting contract terms to maximise their revenue regardless of value delivered, will attempt to take advantage of your ignorance.
As part of this preparation, it may be necessary to get a better handle on what you already have installed, how (or even whether) it is being used, and how it is licensed, as software asset management is a weakness that often surfaces in our research. The last thing you need is to have to rely on the vendor sales rep’s view of what you have in place.
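As a purely illustrative sketch of what that homework might involve, the snippet below (with hypothetical product names and figures) simply reconciles a discovered inventory of installed instances against the entitlements you believe you hold, flagging shortfalls and apparently unused licences. This is the kind of basic picture worth having in hand before any vendor conversation.

```python
from collections import Counter

# Hypothetical data: what discovery tooling reports as installed,
# versus the entitlements recorded in your own contract records.
installed = ["dbms_x", "dbms_x", "dbms_x", "app_server_y", "os_z", "os_z"]
entitlements = {"dbms_x": 2, "app_server_y": 5, "os_z": 2}

deployed = Counter(installed)
for product in sorted(set(deployed) | set(entitlements)):
    licensed = entitlements.get(product, 0)
    in_use = deployed.get(product, 0)
    if in_use > licensed:
        print(f"{product}: {in_use} installed vs {licensed} licensed -- compliance gap")
    elif licensed > in_use:
        print(f"{product}: {licensed - in_use} licences apparently unused")
    else:
        print(f"{product}: in balance")
```

Real asset management tooling obviously goes much further than this, but even a rough reconciliation of this kind puts you on firmer ground than the vendor’s own records.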
We would also highly recommend talking to other organisations with similar environments and objectives. While vendors often write clauses into contracts prohibiting the disclosure of specific terms and prices to third parties, it’s perfectly possible to exchange higher-level experiences on approaches and the games being played without the risk of contract breach.
Having said all this, some software vendors are trying very hard to do the right thing, and we must remember that they are often on a learning curve too. Nevertheless, you still need to be careful as you negotiate your way into the world of dynamic IT.