Looking after an organisation’s estate of desktop and laptop PCs can be a difficult and thankless task. On the one hand, users want the freedom and flexibility to do anything they wish with what they perceive to be their own personal machine. On the other hand, if that machine isn’t working in the way it should, then it’s seen as IT’s job to put things right. Indeed, even though many PC-related problems are of the user’s own making, it’s too often IT that gets the blame for providing unstable or poorly performing hardware or software.
Over the years, IT departments have attempted to retain or regain control in various ways. Locking down PCs and withholding admin privileges from users has been one approach, though this typically isn't very popular with users. Another tactic has been to centralise desktops so everything runs on a server, with the desktop environment accessed via a dedicated thin client terminal or thin client software running on a desktop PC. This approach was popularised by Citrix in the 1990s and persists to this day.
The advantages of the thin client approach are numerous. There is far less to maintain and go wrong on the client device, which minimises operational and support overhead. With a consistent server environment, IT is no longer burdened by the need to cater for widely varying specs and configurations, which further streamlines operations as well as helping from a development and deployment perspective. Other advantages include the repurposing of older machines as thin client terminals, thereby deferring the capital investment that would normally be needed to replace them.
Today, there are essentially two ways of implementing thin client computing. The first is the traditional Citrix-style approach, in which multiple users share a single server-hosted instance of the desktop operating system and applications, each working within their own session. The modern name for this is 'session virtualisation'. The second thin client architecture available today is 'VDI', short for 'Virtual Desktop Infrastructure'. In this model, each user has their own discrete instance of the desktop operating system and associated applications running on the server. It should be noted that Citrix as a vendor now supports both of these thin client approaches, as do other players such as VMware and Microsoft.
It is beyond the scope of this article to go into the pros and cons of session-based virtualisation versus VDI; suffice it to say that VDI is generally more expensive to implement but offers more user flexibility. The big drawback of both approaches, however, is an absolute dependency on the network. Put simply, with any thin client architecture, the user needs a good connection to the server in order to access their desktop. This is obviously a showstopper for some types of user, especially mobile ones.
This is where so-called ‘application virtualisation’ (also known as ‘application streaming’) comes into play. The basic idea is to centralise all application configuration, maintenance, and management, and push out self-contained ‘application packages’ to desktops and laptops. These run locally, but with minimal dependency on the physical machine environment, e.g. applications are no longer reliant on the local registry or local libraries of software components and services. Conflicts between applications, which are a common cause of instability and performance issues, are therefore minimised.
In practical terms, application packages are ‘streamed’ to the client device when first invoked, and remain persistent on the user’s PC until a relevant change in the centrally held package is detected. At this point the changes are pushed down, or ‘re-streamed’, according to a predefined set of rules.
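The streaming behaviour just described can be sketched in code. The following is a minimal, purely illustrative sketch (not any vendor's actual implementation): the `PackageCache` class, its `fetch` and `remote_version` callables, and the package names are all hypothetical, and real products apply far richer policy rules. It simply shows the core idea of streaming a package on first invocation, keeping it persistent locally, and re-streaming only when the centrally held version changes.

```python
from pathlib import Path

class PackageCache:
    """Hypothetical client-side cache of streamed application packages.

    A package is fetched from the central store on first launch and
    re-fetched only when the centrally held version changes.
    """

    def __init__(self, cache_dir: Path, fetch, remote_version):
        self.cache_dir = cache_dir            # local persistent store
        self.fetch = fetch                    # callable: name -> package bytes
        self.remote_version = remote_version  # callable: name -> version string
        self.versions = {}                    # name -> locally cached version

    def get(self, name: str) -> bytes:
        local = self.cache_dir / name
        current = self.remote_version(name)
        # Stream on first invocation, or re-stream if the central
        # package has changed since it was last cached.
        if not local.exists() or self.versions.get(name) != current:
            local.write_bytes(self.fetch(name))
            self.versions[name] = current
        return local.read_bytes()
```

In practice, the 'predefined rules' mentioned above would govern when the version check runs (at launch, on a schedule, or on demand) and whether updates are applied immediately or deferred; the sketch collapses all of that into a simple check at launch time.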
In addition to the approaches we have discussed, other techniques and solutions exist under the ‘desktop virtualisation’ umbrella to deal with different requirements and deployment scenarios. When considering options, it is therefore necessary to understand the pros and cons of each, and where they might be relevant in your particular environment. This in turn is dependent on having a reasonable understanding of your user base and the varying needs and constraints that exist across the segments within it.
When you do your analysis, the chances are that a mix of both traditional desktop PCs and different desktop virtualisation approaches will be appropriate. Get this right, and significant benefits can be gained for both IT and users, the latter often enjoying a more stable and/or more flexible desktop computing environment that better supports new working practices such as hot-desking, home-working and multi-device access in general.
For more information on the various desktop virtualisation approaches, along with some tips and tricks for evaluating options and implementing them successfully, we encourage you to take a look at our Desktop Virtualisation Smart Guide (sponsored by Microsoft), which is available as a free-of-charge e-book from here.
Originally published on Global Knowledge.