Freeform Dynamics recently caught up with VirtualZ, an often overlooked vendor of connectivity tools that bridge the gap between the mainframe, cloud and on-premises IT platform worlds. The more we listened, the more we joined the dots between what niche companies like VirtualZ have to offer and some of the bigger challenges and trends that have been unfolding in the mainframe space for a number of years now. Read on to see our logic…
For many IT professionals outside of the mainframe world, the IBM Z platform is respected for its high-end critical computing capabilities, yet it is often perceived as somehow separate from the broader IT landscape. And in a different kind of way, the same is true in the opposite direction. We still come across mainframe specialists who remain suspicious of some of the trends taking place in the wider world – Agile Development, DevOps, Continuous Delivery – even cloud computing.
Separatist fallout
As a result of such separatist mindsets, we sometimes see strange decisions being made. If there’s a need to make use of mainframe data within an application running on an x86 Linux stack, for example, a project manager might elect to set up a whole ETL process to periodically copy records between the two environments.
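To make that concrete, here is a minimal sketch of the kind of scheduled copy job we mean. Everything in it is invented for illustration – the dataset names, the record layout and the in-memory 'stores' standing in for the mainframe export and the Linux-side database – but it shows the essential point: once the job runs, the same business data exists in two places, under two sets of controls.

```python
# Hypothetical sketch of a periodic ETL copy job between a mainframe export
# and an x86 Linux application store. All names and layouts are invented;
# a real job would pull data over FTP/JDBC/etc. rather than from a local list.

def extract(source):
    """'Extract' step: read records from the mainframe-side export."""
    return list(source)

def transform(records):
    """Reshape mainframe-style fixed-field records for the distributed app."""
    return [{"id": r["ACCT_NO"], "balance": r["BAL"] / 100} for r in records]

def load(records, target):
    """Replace the local copy wholesale -- the data now lives in two places,
    each needing its own security, governance and monitoring regime."""
    target.clear()
    target.extend(records)
    return len(records)

if __name__ == "__main__":
    mainframe_export = [{"ACCT_NO": "0001", "BAL": 125000},
                        {"ACCT_NO": "0002", "BAL": 9900}]
    local_store = []
    loaded = load(transform(extract(mainframe_export)), local_store)
    print(f"Copied {loaded} records into the distributed store")
```

The sketch is trivial on purpose: the technical work is easy, which is precisely why such pipelines proliferate, while the duplicated governance burden they create stays invisible to both teams.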
This suits the application team because it minimises interaction with their mainframe colleagues, whom they see as paranoid and too quick to hide behind red tape. Ironically, it often suits the mainframe team too, who take the view that they no longer have to worry about meddlers on the IBM Z platform: their responsibility ends once the data leaves the IBM Z environment.
The crazy thing is that there’s often no one standing back and considering data governance, security and the cost of complexity in a platform-independent manner. When data-level controls and management processes are applied in two places, in different ways and according to different standards, that not only represents a lot of duplicated effort, but also creates a monitoring, reporting and compliance challenge.
Actions like this run counter to a well-established principle: when copies of data proliferate, you almost always end up in a costly and risky mess, with business value undermined because no one knows which version of the data represents the latest ‘truth’.
Double crazy
Beyond this data copying game, in extreme situations, we also see people making a business case to migrate whole applications off the mainframe because they perceive this would make them – and the data associated with them – more open and accessible.
This kind of move is most likely when a ‘cloud first’ mindset has taken root in the organisation, which in turn (as we have written before) can lead to platform decisions being made on a subjective rather than objective basis. Fortunately, with most organisations having now experienced runaway public cloud costs due to unfavourable, complex and/or changing fee structures and contract terms, this is becoming less of an issue.
However, we still see migration decisions based on ignorance of the mainframe cost model and platform attributes, which for many workloads actually make it a better option than public cloud. Such decisions are rarely in the best interest of the organisation.
Hybrid thinking
During our research at Freeform Dynamics over the past few years, we’ve spoken with a number of very switched-on senior architects and CIOs who see the IBM Z as an integral part of their overall computing environment. Such people are often (though certainly not always) found in the banking sector.
These cross-platform pragmatists tend to think in terms of a triangle of deployment options when considering where applications and workloads should run, with the mainframe, public cloud and non-mainframe on-prem platforms all regarded as equally legitimate. Application placement or migration decisions are then primarily driven by the principle of matching workload requirements with platform attributes. This might consider things like performance, scalability, security, cost models, proximity of interdependent elements, and so on.
Going hand in hand with this approach is typically a good knowledge of how the modern mainframe is a lot more versatile and open than its ancestors. We’ve discussed this many times before (see here, here and here), but suffice it to say that if you ask whether a current standard, protocol or delivery model is now supported by the IBM Z, the answer is almost certainly yes.
The trouble is that the influence of hybrid-IT thought leaders is limited.
Below the radar decisions
If you speak to enough project- and operations-level people from the mainframe and non-mainframe worlds about how the IBM Z can take its rightful place in today’s cloud-dominated and AI-preoccupied world, a lack of awareness of mainframe integration options is still very much evident. That’s not a criticism of anyone; it’s simply that people don’t know what they don’t know, and it often doesn’t occur to them to look for solutions that come at problems from a different direction.
And solutions are even more likely to be overlooked if they fall outside the glamorous emerging tech developments promoted as part of widely marketed narratives from larger players. Put simply, if it isn’t inherently sexy or trendy, or heavily promoted by top-tier mainframe vendors, the chances are you will not hear about it.
Which brings us to VirtualZ, who might not appreciate the way we’ve just implicitly positioned the company, but in our view, ‘sexy’ and ‘valuable’ rarely correlate directly!
Pragmatism trumps hype
During our recent discussion with VirtualZ, we were struck by how the company has taken a laser-focused approach to solving some very specific but important problems. Rather than getting caught up in grand transformation narratives, it has developed targeted solutions that deal with practical data and application integration challenges head-on.
Take its Lozen product, for example, which provides real-time, bi-directional access to mainframe data from applications running anywhere – whether that’s in the cloud, on distributed platforms, or even on Linux partitions running on the mainframe itself. The key here is that it preserves all of the native mainframe data structures and access methods, which means applications can work with VSAM files and other mainframe data formats without modification or complex transformation processes.
Breaking down artificial barriers
This kind of capability removes the need for many of those ETL processes we mentioned earlier, along with the associated data copying and synchronisation overhead. More importantly, it means you can modernise your application landscape without having to make binary ‘migrate or stay’ decisions. Applications can be placed wherever makes most sense from a business and technical perspective, while maintaining seamless access to the data they need.
PropelZ, another product from the VirtualZ stable, complements this by providing a no-code solution for those occasions when you actually do need to move or replicate data – for example, to feed analytics systems or create data lakes. Again, the focus is on removing complexity and risk, while preserving the integrity of mainframe data structures and access methods.
Perhaps most challenging to conventional thinking is the company’s Zaac solution, which allows mainframe applications to use cloud storage as if it were native DASD (Direct Access Storage Device). This brings the kind of flexible storage options that distributed systems have enjoyed for years to the mainframe world, but without compromising on the platform’s renowned security and reliability characteristics.
The elephant in the room
Speaking of security, this is obviously a critical consideration when talking about mainframe data access. Here again, VirtualZ takes a pragmatic approach, leveraging existing mainframe security protocols rather than trying to reinvent the wheel. This means organisations can maintain their robust mainframe security model while still enabling controlled access where it makes business sense.
The bigger picture
What’s particularly interesting about VirtualZ’s approach is how it aligns with the broader industry shift towards hybrid IT that we mentioned earlier. Rather than forcing organisations down a particular path, it provides the tools to make informed choices based on business needs and platform capabilities.
As we continue our research in this space, it’s becoming increasingly clear that the future of the mainframe in the context of broader enterprise IT isn’t about choosing between platforms, but about making them work together effectively. Companies like VirtualZ, while perhaps not grabbing the headlines, are doing the unglamorous but essential work of making this possible.
The bottom line is that if you’re looking to break down the artificial barriers between your mainframe and the rest of your IT estate, solutions exist. You just need to know where to look for them.
Dale is a co-founder of Freeform Dynamics, and today runs the company. As part of this, he oversees the organisation’s industry coverage and research agenda, which tracks technology trends and developments, along with IT-related buying behaviour among mainstream enterprises, SMBs and public sector organisations.