Enterprise Data Caching

Intelligent Caching Engine

Improve the performance of your mainframe integrations and reduce costs with a distributed, in-memory cache.

Your Mainframe – Modernized

Mainframe systems remain one of the most secure and high-volume transaction processing solutions available. Research shows that 71% of the Fortune 500 depend on mainframes for business-critical transactions and data processing (1).

And yet, the operating costs of legacy systems increase annually. Many companies want innovative options to reduce costs while continuing to deliver critical solutions to the business and their customers. Fortunately, connecting external APIs to mainframe applications opens up the potential for even greater processing without increased cost.

Not Your Typical Cache

Adaptigent’s Intelligent Caching Engine reduces mainframe integration load and operational costs.

The solution temporarily stores the results of repetitive data requests in a distributed memory cache running on commodity hardware. It puts full control in the hands of users, who explicitly define ahead of time which data elements are cacheable. A cached response can then be stored, retrieved, and reused later without requiring a new request to the server. This provides an unprecedented level of granularity through a true no-code, visual interface.
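The core idea, serving repeat requests from memory instead of the mainframe, can be sketched in a few lines. This is a minimal illustrative example, not Adaptigent's actual API; the class, key format, and `fetch_account` helper are all hypothetical:

```python
import time

class ResponseCache:
    """Minimal in-memory response cache with a TTL (illustrative sketch only)."""
    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self._store = {}  # request key -> (stored_at, response)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        stored_at, response = entry
        if time.time() - stored_at > self.ttl:
            del self._store[key]  # entry expired; force a fresh request
            return None
        return response

    def put(self, key, response):
        self._store[key] = (time.time(), response)

def fetch_account(cache, account_id, backend_call):
    """Serve repeat requests from the cache; hit the mainframe only on a miss."""
    key = ("GET_ACCOUNT", account_id)  # hypothetical request key
    cached = cache.get(key)
    if cached is not None:
        return cached
    response = backend_call(account_id)  # the expensive mainframe round trip
    cache.put(key, response)
    return response
```

In a real deployment the cache would be distributed across commodity hardware rather than a single process dictionary, but the hit/miss logic is the same.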

 

(1) One Advanced. 2020 Mainframe Modernization Business Barometer Report. Accessed 7 June 2023.

Improve Performance and Lower Costs

Free up resources on your legacy systems. Offload MIPS and significantly improve response times with our Intelligent Caching Engine.

In-memory distributed caching solution for data returning from a mainframe

Intelligent and explicit caching

Support for dynamic and pre-emptive caching strategies

A true no-code, drag-and-drop interface

Reduces peak mainframe load

Intelligent

Traditional caching engines use a naïve proxy cache that either stores an entire transaction or none of it. In contrast, our caching engine's workflows let users choose and set caching policies at the level of individual mainframe transactions. This granularity is a competitive advantage.
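Per-transaction policies can be pictured as a simple lookup: each transaction code carries its own caching decision. The transaction codes, policy fields, and default below are all hypothetical, for illustration only:

```python
# Hypothetical per-transaction cache policies; codes and TTLs are illustrative.
POLICIES = {
    "ACCT-INQ":  {"cacheable": True,  "ttl": 300},   # account inquiry: safe to cache briefly
    "BAL-XFER":  {"cacheable": False},               # funds transfer: never cache
    "RATE-LKUP": {"cacheable": True,  "ttl": 3600},  # rate lookup: rarely changes
}

def is_cacheable(txn_code):
    """Look up the caching policy for one mainframe transaction code."""
    policy = POLICIES.get(txn_code, {"cacheable": False})  # unknown codes: don't cache
    return policy["cacheable"]
```

The point of the sketch is the granularity: the decision is made per transaction type, not all-or-nothing for the whole proxy.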

Preemptive

The engine can preemptively cache, or prefetch, data before it's needed. By fetching and moving data from slow storage to faster local memory ahead of time, companies can offload more mainframe processing when demand is high.
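Prefetching amounts to warming the cache during quiet periods so that peak-hour lookups never reach the mainframe. A minimal sketch, assuming a plain dictionary as the cache and a hypothetical `backend_call` for the slow mainframe fetch:

```python
def prefetch(cache, keys, backend_call):
    """Warm the cache ahead of demand (e.g. during off-peak hours)."""
    for key in keys:
        if key not in cache:
            cache[key] = backend_call(key)  # slow mainframe fetch, done early

def lookup(cache, key, backend_call):
    """At peak time, prefetched keys are served straight from memory."""
    if key in cache:
        return cache[key]
    value = backend_call(key)  # fall back to the mainframe on a miss
    cache[key] = value
    return value
```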

Partial Caching

The intelligent caching engine also supports partial caching. This means a single API request can contain both cacheable and non-cacheable elements. The runtime environment then intelligently pulls data from both the stored cache and from live data on the mainframe to fulfill the API request.
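One way to picture partial caching is to split an API response into fields that may be cached and fields that must always come live from the mainframe, then merge the two at request time. The field names and `live_fetch` signature below are hypothetical, for illustration only:

```python
# Hypothetical field classification for one API request type.
CACHEABLE_FIELDS = {"customer_name", "account_type"}  # stable data: safe to cache
LIVE_FIELDS = {"current_balance"}                     # must come from the mainframe

def fulfill_request(account_id, cache, live_fetch):
    """Assemble one response from cached fields plus a live mainframe call."""
    cached_part = cache.get(account_id)
    if cached_part is None:
        # Cache miss: fetch everything live, then store only the cacheable fields.
        full = live_fetch(account_id, CACHEABLE_FIELDS | LIVE_FIELDS)
        cache[account_id] = {f: full[f] for f in CACHEABLE_FIELDS}
        return full
    # Cache hit: only the non-cacheable fields need a mainframe round trip.
    live_part = live_fetch(account_id, LIVE_FIELDS)
    return {**cached_part, **live_part}
```

On a repeat request, the mainframe is asked for only the live fields, which is how a single API call can draw on both the stored cache and live data.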