Enterprise Data Caching

Intelligent Caching Engine

Improve the performance of your mainframe integrations and reduce costs with a distributed, in-memory cache.

What Is Intelligent Caching Engine?

Intelligent Caching Engine is a distributed, in-memory caching solution that improves API performance and reduces mainframe load by storing and reusing frequently requested data, enabling faster access and lower operational costs.

Modernizing Mainframe Performance and Cost Efficiency

Mainframe systems remain one of the most secure and high-volume transaction processing platforms available. Research shows that 71% of the Fortune 500 rely on mainframes for business-critical transactions and data processing (1).

As API-driven access increases, transaction volumes grow—driving higher processing demand and rising infrastructure costs. Organizations need a way to reduce mainframe utilization while maintaining performance and reliability.

High-Performance In-Memory Caching for Mainframe Integration

Adaptigent’s Intelligent Caching Engine is a distributed, in-memory caching solution designed to reduce mainframe integration load, improve API performance, and lower operational costs. It can reduce API response times by up to 10x–50x for mission-critical data and transaction calls.

The platform temporarily stores the results of repetitive data requests in a distributed memory cache running on commodity hardware. Cached responses can be retrieved instantly without sending repeated requests to the mainframe, reducing latency, lowering system load, and improving application performance.

Users can define which data elements are cacheable ahead of time using a no-code, visual interface, enabling precise control over caching policies and data access.

 

(1) One Advanced. 2020 Mainframe Modernization Business Barometer Report. Accessed 7 June 2023.

Improve Performance and Lower Costs

Free up resources on your legacy systems. Offload MIPS consumption and significantly improve response times with our Intelligent Caching Engine.

In-memory distributed caching for mainframe data

Intelligent, policy-based caching control

Pre-emptive caching to reduce peak mainframe load

Partial caching at the transaction level

Combines cached and live data in real time

No-code, drag-and-drop interface

Intelligent

Traditional caching engines use a naïve proxy cache that either holds an entire transaction or none of it. In contrast, our caching engine lets users define and apply caching policies at the individual mainframe transaction level. This granularity is a competitive advantage.
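One way to picture transaction-level policies is a rule table keyed by transaction ID, where each transaction carries its own cacheability flag and TTL instead of a single all-or-nothing setting. The transaction names and fields below are hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical per-transaction caching policies: each mainframe
# transaction ID gets its own rule instead of an all-or-nothing proxy cache.
POLICIES = {
    "ACCT-INQ":  {"cacheable": True,  "ttl_seconds": 300},  # slow-changing data
    "BAL-CHECK": {"cacheable": True,  "ttl_seconds": 5},    # near-real-time data
    "FUNDS-XFR": {"cacheable": False, "ttl_seconds": 0},    # never cache writes
}

def policy_for(transaction_id):
    """Look up the caching rule for a transaction; default to no caching."""
    return POLICIES.get(transaction_id, {"cacheable": False, "ttl_seconds": 0})
```

A read-heavy inquiry can then be cached for minutes while a funds transfer is always sent straight to the mainframe, mirroring the granular control the visual interface exposes.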

Pre-emptive

The engine can pre-emptively cache, or prefetch, data before it is needed. By fetching data from slow storage into faster local memory ahead of demand, companies can offload more mainframe processing when demand is high.
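The prefetch idea can be sketched as a warm-up pass run during off-peak hours: known-hot keys are fetched once and held in memory so peak-time requests never touch the mainframe. The key names and `call_mainframe` stub below are assumptions for illustration only.

```python
# Hypothetical off-peak prefetch: warm the cache with known-hot keys
# before peak demand, so those requests never reach the mainframe.
cache = {}

def call_mainframe(key):
    # Stand-in for a slow mainframe data fetch.
    return f"result-for-{key}"

def prefetch(hot_keys):
    """Load highly requested data into local memory ahead of demand."""
    for key in hot_keys:
        if key not in cache:
            cache[key] = call_mainframe(key)

# Warm the cache during a low-demand window.
prefetch(["rates-usd", "rates-eur", "branch-list"])
```

During peak hours, requests for those keys are then served entirely from memory.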

Partial Caching

The intelligent caching engine also supports partial caching. This means a single API request can contain both cacheable and non-cacheable elements. The runtime environment then intelligently pulls data from both the stored cache and from live data on the mainframe to fulfill the API request.
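A minimal sketch of that split-and-merge behavior: each requested field is routed either to the in-memory cache (for slow-changing elements) or to a live lookup (for volatile ones), and the two sources are combined into one response. The field names, cache contents, and `fetch_live` stub are hypothetical.

```python
# Hypothetical partial-caching sketch: one API request mixes cacheable
# fields (served from memory) with live fields (fetched per call).
CACHEABLE_FIELDS = {"customer_name", "account_type"}  # slow-changing elements
cache = {"customer_name": "A. Smith", "account_type": "checking"}

def fetch_live(field):
    # Stand-in for a live mainframe lookup of a volatile field.
    return {"current_balance": 1042.17}[field]

def handle_request(fields):
    """Fulfil one request from cached and live mainframe data together."""
    response = {}
    for field in fields:
        if field in CACHEABLE_FIELDS and field in cache:
            response[field] = cache[field]       # served from memory
        else:
            response[field] = fetch_live(field)  # live mainframe data
    return response
```

A single request for a customer's name and current balance thus triggers only one live mainframe lookup instead of two.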

Frequently Asked Questions

How does Intelligent Caching Engine reduce mainframe load?

Intelligent Caching Engine reduces mainframe load by storing the results of frequently requested data in a distributed, in-memory cache. This allows applications to retrieve data without repeatedly calling the mainframe, lowering processing demand and improving system performance.

What makes Intelligent Caching Engine different from traditional caching solutions?

Unlike traditional caching solutions that cache entire transactions, Intelligent Caching Engine supports granular, pre-emptive, and partial caching. This allows teams to cache specific elements of a request and combine cached data with live mainframe data in real time for improved efficiency and accuracy.

What is pre-emptive caching and how does it work?

Pre-emptive caching allows organizations to load highly cacheable data into memory during periods of low mainframe demand. This reduces processing load during peak usage times and improves response performance for high-volume API requests.

Can Intelligent Caching Engine improve API performance?

Yes, Intelligent Caching Engine improves API performance by reducing the number of direct calls to the mainframe. By serving cached data instantly, it lowers latency and enables faster response times for modern applications and services.

Does Intelligent Caching Engine require coding to configure caching policies?

No, Intelligent Caching Engine provides a no-code, visual interface that allows users to define caching policies and control which data elements are stored. This enables both technical and business users to manage caching strategies without writing code.