Data Integration

Making Mainframe Modernization Easy With Qlik Data Integration and Microsoft

With Ashish Khandelwal, Mainframe Modernization Engineer at Microsoft, Mukesh Kumar, Principal Group Engineering Architecture Manager at Microsoft, and Tom Griggs, Global Partner Senior Manager at Qlik


Kevin Pardue

5 min read

Two professionals looking at a tablet. Text: "Mainframe Modernization with Qlik and Microsoft Azure: How to Move Mainframe Data to the Cloud – in Real Time." Logos of Qlik and Microsoft are present.

If you work at an enterprise, you probably rely on a Mainframe to host mission-critical applications, especially if you work in Banking or Financial Services. The Mainframe’s speed, capacity, intelligence, security, and accountability are all essential to the financial industry’s success. But Mainframes are indispensable well beyond Finance and Banking. For example:

  • 92 of the top 100 global banks leverage Mainframes

  • Over 20 of the largest commercial airlines are on Mainframes

  • All of the top 10 insurers use Mainframes

  • 71% of the Fortune 500 are running Mainframes

  • 90% of all credit card transactions are processed on Mainframes

It’s hard to imagine a world without Mainframes, given the immense amount of data they generate. When that data is made accessible and blended with data from the rest of the organization, it can drive increased revenue, greater efficiency, powerful innovation, and competitive advantage.

The Top 3 Mainframe Data Integration Challenges

Unlocking insights from Mainframe data can be costly and time-consuming, and if it’s not done correctly, you run the risk of adversely impacting production systems. The question to ask is: is the lift required to make Mainframe data accessible and ready for analytics, cloud migration, or downstream systems worth the effort, and what are the common pitfalls to watch out for? Here are three of the most common:

| Mainframe Data Access Process | The Challenges |
| --- | --- |
| **Batch File Transfer** – Scheduled scripts or mainframe jobs extract data from the mainframe and write the results to flat files. These large files must then be transferred over the network and transformed into their target data structure, such as a data lake. | **Expired Data** – With inherent delays in the process, data isn't delivered in real time, and it quickly becomes irrelevant for any application that requires fresh data. Today, that is most of them. |
| **Direct Database Query** – Most businesses looking to integrate Mainframes into a broader analytical environment take a brute-force approach, querying the Mainframe directly to access the data they need. | **High Costs + Network Latency** – Each new query consumes more instructions, adding to the expensive millions of instructions per second (MIPS) monthly bill. Additionally, every query disrupts the production system. |
| **Real-Time Data Streaming** – To stream in real time, data must be moved immediately whenever a change occurs. | **High Degree of Labor** – Without the correct data integration architecture, it takes a significant amount of manual tuning to support the broad, deep, and fast analysis businesses need. |

Taking the First Steps to Mainframe Modernization

Getting Mainframe data to the cloud, where it can be conformed, processed, and blended with data from other source systems, is the end-state goal of Mainframe modernization. The reasons are many: making data available to downstream systems, populating data lakes and warehouses, analysis and reporting, read-only workloads, and AI and machine learning efforts.

Reducing labor-intensive, error-prone manual processes enables seamless, real-time data movement and transformation from Mainframe systems to a wide range of cloud computing services, and Change Data Capture (CDC) streaming technology can continuously integrate Mainframe data with other data on cloud platforms. CDC dramatically reduces time-to-value with automated mapping and data-model generation for modern cloud databases like Azure SQL, PostgreSQL, and Cosmos DB; data warehouses like Azure Synapse; and data lakes like Databricks Lakehouse and Azure Data Lake Storage Gen2.
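To make the CDC idea concrete, here is a minimal, illustrative sketch of how row-level change events replayed in order keep a cloud target continuously in sync with a source. This is a conceptual example only, not Qlik's implementation; all names (`ChangeEvent`, `apply_change`, the `ACCOUNTS` table) are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChangeEvent:
    """One row-level change read from the source's transaction log."""
    op: str                 # "insert", "update", or "delete"
    table: str              # source table name, e.g. a table on the mainframe
    key: str                # primary-key value of the affected row
    row: Optional[dict]     # new column values (None for deletes)

def apply_change(target: dict, ev: ChangeEvent) -> None:
    """Replay a single change event against a keyed target store, so the
    cloud copy converges on the source's state without bulk re-extracts."""
    tbl = target.setdefault(ev.table, {})
    if ev.op in ("insert", "update"):
        tbl[ev.key] = ev.row
    elif ev.op == "delete":
        tbl.pop(ev.key, None)

# Simulated stream of log records: each change lands as it occurs,
# instead of waiting for a nightly batch file.
events = [
    ChangeEvent("insert", "ACCOUNTS", "A1", {"balance": 100}),
    ChangeEvent("update", "ACCOUNTS", "A1", {"balance": 250}),
    ChangeEvent("insert", "ACCOUNTS", "A2", {"balance": 75}),
    ChangeEvent("delete", "ACCOUNTS", "A2", None),
]

target_store: dict = {}
for ev in events:
    apply_change(target_store, ev)

print(target_store)  # {'ACCOUNTS': {'A1': {'balance': 250}}}
```

Because each event carries only the delta, the target stays current with low latency and the source is never re-queried in bulk.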

5 Ways Qlik Data Integration Makes Mainframe Modernization Easier

  1. Access Near Real-Time Data - Data changes within the Mainframe are immediately replicated to the Azure Cloud, eliminating the need to move data in periodic batches.

  2. Keep Systems Up and Costs Down - Qlik’s log-based change data capture and log streaming are non-invasive and mostly agentless – so they have a low impact on production systems and do not incur the hefty MIPS price tag.

  3. Expand Data Availability - You can replicate, synchronize, distribute, consolidate, and ingest data not only from Mainframes but across all major databases and data warehouses, whether on-premises or in the cloud.

  4. Speed Time-to-Value - The intuitive, wizard-based GUI is easy to use, with no hand coding required.

  5. Enable Hybrid Options - Rarely does enterprise data exist solely on-premises or in the cloud; Qlik and Azure enable hybrid options.

How Does It Work?

Qlik Data Integration extracts data from 40+ sources in real time with log-based CDC technology for low-impact, high-performing, secure, and reliable connectivity. A near zero-footprint architecture offers automation and a low-code/no-code approach that is highly scalable, open, and flexible, fully supporting Microsoft Azure.
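The "log-based, low-impact" point above can be sketched in a few lines: a log reader tails the transaction log from a saved position rather than querying production tables, which is why log-based CDC avoids the query-driven MIPS costs described earlier. This is a simplified, hypothetical illustration (the in-memory list standing in for a transaction log, and the function names, are assumptions), not Qlik's actual architecture.

```python
from typing import Tuple

def read_log(log: list, from_pos: int) -> Tuple[list, int]:
    """Return log records appended since `from_pos`, plus the new position.
    Only the log is read; production tables are never queried."""
    return log[from_pos:], len(log)

# Hypothetical in-memory stand-in for a mainframe transaction log.
txn_log = [{"op": "insert", "key": "A1"}, {"op": "update", "key": "A1"}]

pos = 0          # last position delivered downstream (a "checkpoint")
delivered = []   # records shipped to the cloud target

for _ in range(2):  # two polling cycles
    batch, pos = read_log(txn_log, pos)
    delivered.extend(batch)
    txn_log.append({"op": "insert", "key": "A2"})  # new source activity

print(len(delivered), pos)  # 3 3
```

Keeping a checkpoint position also means delivery can resume after an interruption without re-reading the whole log.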

Diagram showing data integration from Mainframes to Azure Cloud via Qlik Data Integration Platform. Data flows through replication, transformations, and staging processes, then saves in various Azure data storage services.

Ready To Put Your Mainframe Data To Greater Use?

Qlik and Microsoft have partnered to take the risk out of trying this solution and make Mainframe modernization easy. The Qlik and Microsoft Azure Accelerator for Mainframe is a no-cost, proof-of-value trial that includes software and subject matter expertise for delivering real-time, analytics-ready data from your Mainframe systems to Microsoft Azure for analysis and action. Here’s what’s included:

A three-step process diagram: "Discovery," "Workshop," and "POC." Each step highlights key activities and expertise, focusing on Qlik, Microsoft, Azure, and Mainframe integration.

Visit Qlik to Learn More and Get Started or get more details at Microsoft's page.

Want to learn more about how to modernize your Mainframe? The key is leveraging real-time data and the cloud.

Ready to get started?