Jason Bloomberg, President, Intellyx
Caching engines are a common, important tool in any distributed computing architect’s toolbox. Caches reduce the time and effort to fetch content or execute queries on databases, document stores, web servers, and other infrastructure elements.
At first glance, caches are deceptively simple: instead of fetching information from the source, store it in the cache. Now subsequent requests can hit the cache instead, reducing latency and lightening the load on the back-end database, server, or other system of record.
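This look-in-the-cache-first pattern is often called cache-aside, and it can be sketched in a few lines. The Python below is purely illustrative – the class and the `fetch_from_source` callback are invented for this example – but it shows the essential move: serve from the cache on a hit, and fall back to the system of record (then populate the cache) on a miss.

```python
import time

class SimpleCache:
    """Minimal cache-aside sketch: check the cache first, fall back to
    the system of record on a miss, and store the result with a
    time-to-live (TTL) so stale entries eventually expire."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expiry timestamp)

    def get(self, key, fetch_from_source):
        entry = self.store.get(key)
        if entry is not None:
            value, expires_at = entry
            if time.time() < expires_at:
                return value  # cache hit: no back-end call needed
        # Cache miss (or expired entry): hit the back end, then
        # populate the cache for subsequent requests.
        value = fetch_from_source(key)
        self.store[key] = (value, time.time() + self.ttl)
        return value
```

In a real deployment the `fetch_from_source` callback would wrap a database query or back-end API call, and the in-process dictionary would typically be replaced by a shared store such as Redis or Memcached.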
This straightforward but powerful value proposition has led to the creation and maturation of several open-source caches on the market, including Redis, Memcached, Apache Ignite, NGINX, and several others. These products support throngs of distributed computing applications, helping them run at massive scale.
One might wonder, therefore, if there’s an opening in the market for a new commercial cache offering. Adaptigent (formerly GT Software) certainly thinks so, as the company recently launched its new Intelligent Caching engine.
Despite the relative maturity of the open-source caches available, Adaptigent has crafted a product with a well-differentiated value proposition, one that will appeal to a mainframe-centric customer base for whom the existing products fall short. Let’s take a closer look.
Caching Mainframe Data
Adaptigent has long specialized in unlocking the value of mainframe data via no-code integration technology. Its Intelligent Caching engine unsurprisingly focuses on this important segment of the data access market.
The value proposition for mainframe data caching is twofold. The first, like the other distributed computing caches, is to make data more easily available with lower latency.
The second part of the story, however, is specific to the mainframe, as IBM charges customers by the MIPS (millions of instructions per second) – and database queries can potentially consume many of them.
As a result, supporting modern application requirements with mainframe-based systems of record can be unexpectedly expensive. Imagine if every request in your mobile banking app required MIPS on the mainframe to execute – and then multiply by the number of customers at each bank.
Caching is an obvious solution to this problem. Instead of requiring every mobile banking transaction to hit the mainframe, cache the results instead. As a result, it would be possible to respond to many such requests from the cache without requiring any mainframe processing at all. It sounds promising, right?
Well, as you might expect, the devil is in the details.
Adaptigent’s Secret Sauce
Traditional caching engines approach caching as an all-or-nothing affair: either cache the result of the entire transaction or cache none of it. The common name for this approach is ‘naïve’ caching.
If you’re caching, say, the HTML for a web page or an entire PDF document, this naïve approach makes sense. But the result of a mainframe transaction may combine information that changes frequently with content that changes only on occasion, limiting the value of all-or-nothing caching.
The Adaptigent Intelligent Cache engine resolves this problem. It provides workflows so users can configure and set granular caching policies that target parts of a transaction differently from the rest of the transaction.
Furthermore, traditional caching engines have no way of knowing what data they will store until a request comes through. At that point they may cache the result of that request in order to serve future requests for the same data.
This as-needed caching strategy works well when the number of requests for cached data is large compared to the total number of requests. But if most of the requests are for different data, then the caching engine spends an inordinate amount of time and processor cycles caching data that won’t be useful enough to warrant the effort.
To address this problem, Adaptigent Intelligent Caching offers pre-emptive caching. Instead of routinely caching results of new queries, the integration designer plans ahead for what data the engine should cache, and at what intervals.
The benefits to pre-emptive caching are twofold: it’s possible to avoid caching data that won’t benefit from being in the cache, and it’s also possible to schedule caching at periodic intervals that correspond to periods where demand on the mainframe is low – hence reducing the MIPS costs.
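As a rough illustration of the idea – not Adaptigent’s actual interface; the function and field names here are invented – pre-emptive caching amounts to running a pre-selected list of queries on a schedule and storing the results before anyone asks for them:

```python
import time

def warm_cache(cache, planned_queries, run_query):
    """Pre-emptive warming sketch: execute each query the integration
    designer has selected ahead of time and store its result, so later
    requests never touch the back end. A scheduler (e.g. a nightly cron
    job) would invoke this during low-demand windows, when mainframe
    MIPS consumption is cheapest."""
    for query in planned_queries:
        cache[query] = {
            "result": run_query(query),
            "refreshed_at": time.time(),  # records when the entry was warmed
        }
```

The key difference from as-needed caching is that `planned_queries` is chosen ahead of time by a person, so no cycles are wasted caching one-off results that will never be requested again.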
The third important capability Adaptigent Intelligent Caching offers is partial caching. With traditional caching, a single API request will either hit the cache or the underlying source system. Adaptigent, in contrast, recognizes that each API request may have a mix of cacheable and non-cacheable elements.
In such cases, the runtime environment will selectively pull data from the cache as well as live mainframe data to fulfill the request on the fly, without requiring any further interaction from a person.
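A hypothetical sketch of that merge step might look as follows. Again, this is not Adaptigent’s actual API – the field names, the `fetch_live` callback, and the split between cacheable and live fields are all invented for illustration – but it shows how a single response can be assembled from both sources:

```python
# Fields that change rarely can be served from the cache; volatile
# fields must always come from the live mainframe system of record.
CACHEABLE_FIELDS = {"customer_name", "account_type"}  # changes rarely
LIVE_FIELDS = {"current_balance"}                     # must be fresh

def fulfill_request(account_id, cache, fetch_live):
    """Serve cacheable fields from the cache, fetch only the volatile
    (or missing) fields from the back end, and merge the two into one
    response on the fly."""
    cached = cache.get(account_id, {})
    response = {f: cached[f] for f in CACHEABLE_FIELDS if f in cached}

    # One trimmed-down back-end call covers the live fields plus any
    # cacheable fields that were not yet in the cache.
    missing = (CACHEABLE_FIELDS - response.keys()) | LIVE_FIELDS
    live = fetch_live(account_id, missing)
    response.update(live)

    # Refresh the cache with any cacheable fields fetched live.
    cache.setdefault(account_id, {}).update(
        {f: v for f, v in live.items() if f in CACHEABLE_FIELDS})
    return response
```

The payoff is that the expensive back-end call is narrowed to the handful of fields that genuinely must be current, rather than re-executing the entire transaction.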
Adaptigent Intelligent Caching’s final important differentiator is its no-code interface. Whereas the open-source caching engines expose command-line interfaces, Adaptigent provides a simple drag-and-drop interface, making cache configuration straightforward for a far broader range of users than the open-source products can serve.
The Intellyx Take
Different organizations will find different aspects of the Adaptigent Intelligent Caching engine to be particularly useful. Generally speaking, however, its most important capability in terms of ROI is likely to be the fact that operators can shift cache operations to low-demand periods, thus lowering mainframe MIPS costs.
It may be possible to accomplish a similar benefit with the open-source caching engines – up to a point. To be sure, some mainframe-based transactions today may require nothing fresher than last night’s data. In the general case, however, last night’s data is not adequate.
When fresher data is required, other caches have no choice but to refresh in full, negating any savings from the cache operations. Adaptigent, in contrast, only needs to update the parts of those transactions that must be current – while fetching the bulk of the information from the cache.
Copyright © Intellyx LLC. Adaptigent is an Intellyx customer. Intellyx retains final editorial control of this article.