Have you ever stayed at a hotel that offers warm chocolate chip cookies at check-in? Their employees never ask you to wait for the next batch or apologize that the oven is preheating. You smell cookies and receive your treat in nearly the same breath.
But baking cookies is no small task; not even a bakery could ensure fresh cookies every moment. Instead, we benefit from the magic of a warming drawer.
Bake a batch or one cookie at a time
The tech equivalent of a warming drawer is caching. If mainframes are the ovens, the cache is the warming drawer. Mainframes are powerful, but in most cases, data must be pulled each time it’s needed. It’d be like opening and closing an oven door over and over to grab one cookie at a time.
In the same way, repeatedly pulling the same data from a mainframe is expensive, time-consuming, and inefficient. Customers don’t like waiting – on cookies or technology.
What data to cache
If your business needs a particular type of data daily, you can copy that data off the mainframe and into a temporary cache instead. It’s like moving baked cookies to the warming drawer until needed – which will be soon.
Trading off capacity for speed, a cache typically stores a subset of data for a limited amount of time (1). The data is ready whenever needed but has an expiration rule to avoid data thrashing and inconsistency. Mainframe data caching is especially impactful in large-scale environments where multiple applications can access the same data simultaneously.
For example, a home insurance company may run an underwriting engine and a claims adjudication application simultaneously. Both call the mainframe for a customer’s address. By serving customer address data from the cache instead of pulling it from the mainframe each time, the insurance company reduces the number of mainframe calls. The cache also likely provides lower latency.
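The pattern above can be sketched in a few lines. This is a minimal illustration, not a production cache: the customer IDs, the address value, and the `fetch_address_from_mainframe` stand-in for the real mainframe call are all hypothetical.

```python
import time

class TTLCache:
    """Minimal time-limited cache: each entry carries an expiration timestamp."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expires_at)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # expired: evict to avoid serving stale data
            return None
        return value

    def put(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)


# Hypothetical stand-in for an expensive mainframe lookup.
calls = 0
def fetch_address_from_mainframe(customer_id):
    global calls
    calls += 1
    return f"123 Main St (customer {customer_id})"

cache = TTLCache(ttl_seconds=300)

def get_address(customer_id):
    address = cache.get(customer_id)
    if address is None:  # cache miss: go to the mainframe, then cache the result
        address = fetch_address_from_mainframe(customer_id)
        cache.put(customer_id, address)
    return address

# Underwriting and claims adjudication both ask for the same address;
# only the first request reaches the mainframe.
get_address("C-1001")
get_address("C-1001")
print(calls)  # → 1
```

The expiration check on read keeps the cache honest: once an entry ages past its TTL, the next request falls through to the system of record instead of returning stale data.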
High-demand cookie times
Taking it one step further, there’s also intelligent caching or smart caching. These techniques predict future data requests based on usage patterns (2). It would be like a warming drawer that learns peak cookie demand and tells the baking crew when to start the next batch.
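One simple way to predict demand is to track how often each data element is requested and pre-warm the cache with the most popular ones before peak load. This is a rough sketch of that idea under simplified assumptions; real predictive caching would weigh recency, time of day, and more.

```python
from collections import Counter

class PredictivePrefetcher:
    """Tracks request frequency and nominates the hottest keys for pre-warming."""

    def __init__(self, top_n=3):
        self.counts = Counter()
        self.top_n = top_n

    def record(self, key):
        # Called on every request so the counter reflects real usage patterns.
        self.counts[key] += 1

    def keys_to_prewarm(self):
        # The most frequently requested keys are the best prefetch candidates.
        return [key for key, _ in self.counts.most_common(self.top_n)]


prefetcher = PredictivePrefetcher(top_n=2)
for key in ["policy", "address", "policy", "claims", "policy", "address"]:
    prefetcher.record(key)

print(prefetcher.keys_to_prewarm())  # → ['policy', 'address']
```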
Another form of intelligent caching assigns different roles to different types of data. It lets users define not only which data elements are cached, but also how long each one remains valid before it expires.
“Intelligent caching is especially helpful in orchestration-type APIs where some of the information is static and some is dynamic,” says Don Spoerke, product evangelist at Adaptigent.
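That idea can be expressed as a per-element expiration policy: static fields get long lifetimes, dynamic fields short ones. The field names and TTL values below are illustrative assumptions, not a real product configuration.

```python
# Hypothetical TTL policy: static data can live in the cache far longer
# than dynamic data.
TTL_POLICY = {
    "customer_address": 86400,  # static: refresh daily
    "policy_terms": 86400,      # static: refresh daily
    "claim_status": 60,         # dynamic: refresh every minute
}
DEFAULT_TTL = 300  # fallback for fields with no explicit rule

def ttl_for(field):
    """Return how many seconds a cached field stays valid before expiring."""
    return TTL_POLICY.get(field, DEFAULT_TTL)

print(ttl_for("customer_address"))  # → 86400
print(ttl_for("claim_status"))      # → 60
print(ttl_for("unlisted_field"))    # → 300
```

In an orchestration-style API that assembles a response from several fields, a policy like this lets the static parts come from cache almost every time while the dynamic parts stay fresh.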
There are other benefits. Adaptigent’s Intelligent Caching engine, for instance, provides:
- In-memory distributed caching solution for data returning from a mainframe
- Intelligent and explicit caching
- Support for dynamic and pre-emptive caching strategies
- A true no-code, drag-and-drop interface
- Reduced mainframe load
By fetching and moving data from slower mainframe storage to faster local memory, companies can offload more mainframe processing when demand is high. This reduces the load on the mainframe and potentially reduces operational costs and latency.
Happy customers and happy companies warm everyone’s hearts.