Mainframes are primed to churn through and store huge amounts of data every day, and the traditional workflow has long meant that batch processing is a prescheduled, after-hours event. But is that workflow still the right solution at a time when IT is being asked to move faster?
In a recent SHARE presentation, Rebecca Levesque, CEO of 21st Century Software, explained why she believes batch processing needs to modernize to keep pace with the rest of the mainframe environment and enable business innovation. One need only consider the standard batch processing workflow to identify areas of potential improvement, she said.
Batch processing works through large volumes of data, often terabytes or more. This data includes the common records that underpin a business’s day-to-day operations, such as a bank’s customer account information or a retailer’s inventory counts. Traditionally, batch jobs could be scheduled during the day and run during a period of downtime for the business – most often after hours.
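For readers less familiar with the mainframe side of this workflow, a traditional batch job is typically expressed in JCL. The sketch below is purely illustrative – the job details and dataset names are hypothetical, not from the presentation – and shows a single overnight step that sorts a day’s inventory records with DFSORT.

//NIGHTLY  JOB (ACCT),'INVENTORY BATCH',CLASS=A,MSGCLASS=X
//* Illustrative nightly step: sort the day's inventory records.
//* Dataset names below are hypothetical examples.
//SORTSTEP EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DSN=PROD.RETAIL.INV.DAILY,DISP=SHR
//SORTOUT  DD DSN=PROD.RETAIL.INV.SORTED,
//            DISP=(NEW,CATLG,DELETE),
//            SPACE=(CYL,(50,10)),
//            DCB=(RECFM=FB,LRECL=80)
//SYSIN    DD *
  SORT FIELDS=(1,10,CH,A)
/*

Jobs like this are handed to a scheduler, queued for the after-hours window, and run largely unattended – which is exactly the model Levesque argues needs rethinking.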
When the processing is complete, the business receives reports with statistics it can use to make decisions (“Do we need to order more inventory?”). Some of the processed data can also be exposed directly to users – for example, when you check your account balance at an ATM.
The batch process as we know it has existed, largely unchanged, since the days when system operators manually fed punch cards into machines, Levesque said. That’s a problem, she argued, because modern business requires companies to be able to access and act on the latest data faster and on a continuous basis. The rest of the mainframe environment is being modernized to adapt to this new speed of business innovation, but batch has lagged, she said.
“When it comes to batch, we don’t seem to want to hurry it up,” said Levesque. “We want to go with existing processes that have been around forever. It’s impossible for us to say we’re going to have an infrastructure modernization project in the mainframe if we’re not going to seriously look at how we use analytics and automation in our batch process.”
Those two capabilities – intelligent storage analytics and automation – are vital to making batch more valuable to the rest of the business, she said. Operators need to eliminate as many manual steps in batch processing as possible in order to speed up the function and accelerate delivery of the business insights (e.g., inventory, billing, and other records) that help companies make decisions.
As an example, Levesque described the often-fragmented data recovery process. Traditionally, if a disruption occurred that required operators to restore previously saved data, it took a great deal of manual work, including discussions with each individual application owner to find their backups and conduct restores. Processing could take several hours to resume, a window of time that’s not acceptable given the speed at which businesses move today. If backups are taken automatically at appropriate times, operators can locate those backups far more quickly, run the restore JCL, and retrieve the application data, allowing processing to resume within an hour or so. Intelligent automation and analytics also help operators identify data interdependencies and vulnerabilities, which helps ensure that the completed restore yields uncorrupted data.
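To make that concrete, a dataset restore on z/OS is often driven by a short JCL job that invokes DFSMSdss (program ADRDSSU). The sketch below is a simplified, hypothetical example – the dataset names are invented – of the kind of restore JCL that automation could generate and submit once the right backup has been located.

//RESTORE  JOB (ACCT),'APP DATA RESTORE',CLASS=A,MSGCLASS=X
//* Illustrative restore step: recover application datasets from a
//* previously taken logical dump. All names are hypothetical.
//DSSREST  EXEC PGM=ADRDSSU
//SYSPRINT DD SYSOUT=*
//BACKUP   DD DSN=BKUP.PAYROLL.WEEKLY.DUMP,DISP=SHR
//SYSIN    DD *
  RESTORE DATASET(INCLUDE(PROD.PAYROLL.**)) INDDNAME(BACKUP) REPLACE
/*

In the manual version of this process, someone has to find the right dump dataset, write or adapt this JCL, and submit it for every affected application; automation’s role is to handle that discovery and submission without the phone calls.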
Mainframe analytical insights are vital to the health of the entire enterprise IT footprint, Levesque said. Analytics can provide a comprehensive view of applications across the mainframe, and make it easier for operators to ensure continued system resiliency and the availability of critical business applications. As a next step, she encouraged system operators to take advantage of technology that will help them automate, simplify, and conduct analysis during batch processing.
Most of all, these insights can help put the people who manage batch operations at the forefront of the discussion around the mainframe’s contribution to overall business value. The data processed by batch jobs holds huge potential value for corporations. Removing roadblocks to the speed at which that data is processed and made available will give mainframe professionals a bigger seat at the table in important IT discussions.
Check out the SHARE Communities for more resources on important issues in mainframe, including technology, training, and industry trends.