Mainframe Integration Myth #2

Oct 27, 2021

Myth #2: You have to run mainframe integration software on the mainframe.

As we continue our mainframe integration myth series, we turn to the belief that legacy system integration software must run on the mainframe. That belief is a myth, pure and simple. There is certainly pushback throughout the industry – among both end users and vendors – from those who once considered running this software off the mainframe impossible. But innovative solutions exist today that make running mainframe integration software off the mainframe a reality.

Users are often hesitant to run applications outside the mainframe, given the significant investment they have made in the hardware – both in dollars and in run time – and its core role in housing mission-critical data. In years past, many users preferred to code integration solutions themselves and put them through rigorous testing for a year or more to ensure effectiveness and security. Vendors, on the other hand, have a vested interest in keeping their users on the mainframe. Their end goal is to make transitioning away from the mainframe more difficult, because that keeps their platforms more viable.

In reality, mainframe integration software can run off the mainframe in a distributed, cloud-native environment, and it can run just as quickly as solutions operating on the mainframe. Options exist that give users a choice of deployment approach: on-mainframe as a z/OS started task or inside CICS, or off-mainframe in Windows and Linux virtual machines or Docker and OpenShift containers. All of these can run on-premises or in multi-cloud environments such as Azure, AWS, Google Cloud and others. A solution with these options offers the flexibility for easy, rapid deployment in the environment that makes the most sense for the business.
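
To make the off-mainframe pattern concrete, here is a minimal sketch of integration code running in a Linux virtual machine or container and reading mainframe-resident data from Db2 for z/OS over TCP/IP using IBM's ibm_db Python driver. The hostname, port, location name, table, and credentials are placeholders, not values from any specific product – the point is simply that the integration logic itself does not have to execute on the mainframe to reach mainframe data.

```python
import ibm_db

# Connection string for a Db2 for z/OS subsystem reachable over TCP/IP.
# All values below are placeholders for illustration only.
conn_str = (
    "DATABASE=DSNLOC01;"        # Db2 location name (placeholder)
    "HOSTNAME=zos.example.com;"  # mainframe host (placeholder)
    "PORT=446;"                  # standard DRDA port
    "PROTOCOL=TCPIP;"
    "UID=integuser;"             # credentials (placeholders)
    "PWD=secret;"
)

# Connect from the distributed environment to the mainframe data source.
conn = ibm_db.connect(conn_str, "", "")

# Read a small sample of mainframe-resident data.
stmt = ibm_db.exec_immediate(
    conn,
    "SELECT ORDER_ID, STATUS FROM ORDERS FETCH FIRST 5 ROWS ONLY",
)
row = ibm_db.fetch_assoc(stmt)
while row:
    print(row["ORDER_ID"], row["STATUS"])
    row = ibm_db.fetch_assoc(stmt)

ibm_db.close(conn)
```

The same script can be packaged into a Docker or OpenShift container image and deployed on-premises or in Azure, AWS or Google Cloud, which is what makes the off-mainframe deployment choices above practical rather than theoretical.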