The mainframe-Linux alliance turns 20 this month and is proving to be more vital than ever, primarily in the form of Big Iron-based Red Hat OpenShift.

The mainframe has been declared “dead,” “morphed” and “transformed” so many times over the years that it’s sometimes hard to believe IBM’s Big Iron still has an identity in the enterprise world. But clearly it does, and in a major way, too.

Take recent news as an example: According to IBM, 75% of the top 20 global banks are running its newest z15 mainframe, and the IBM Systems Group reported a 68% gain in Q2 IBM Z mainframe revenue year-over-year.

At the heart of its current vitality is Linux, primarily in the form of Big Iron-based Red Hat OpenShift, along with a variety of software such as IBM Cloud Paks and open-source applications. The Linux-mainframe marriage is celebrating 20 years this month, and the mashup, incongruous as it seemed at the beginning, has been a boon for the mainframe, which by most accounts still has plenty of good years ahead of it.

“For the first five or so years we really were just experimenting with what we could do with Linux and the mainframe, but then the server-consolidation movement hit, and we knew we had something big,” said Ross Mauri, the general manager for IBM Z.

“What really got us going was the big Wall Street financial companies who all had these Sun Solaris servers with big databases, and many decided to consolidate on the Z mainframe running Linux, and we were off and running,” he said.

Another contributing factor in 2000 was Big Blue’s $1 billion investment in all things Linux, which was a huge move toward getting the operating system, and open-source software in general, into the mainstream business market. Since that time there have been numerous milestones on the mainframe’s Linux journey, including the introduction five years ago of a standalone box, the LinuxONE, which is now at the heart of some of the world’s largest implementations.
Red Hat to the rescue

The next chapter in the mainframe story began last year when IBM bought Linux powerhouse Red Hat for $34 billion, tying the massive transactional capacity, security and reliability of the mainframe to Red Hat OpenShift and Red Hat Enterprise Linux. IBM has also released Red Hat Ansible Certified Content for IBM Z and launched a new cloud-native development offering, Wazi Workspaces, which lets developers apply industry-standard tools from IBM Z to multi-cloud platforms optimized for OpenShift.

Combine those moves with all of the open-source mainframe software work going on in the Linux Foundation’s Open Mainframe Project, and customers now have a wealth of development options for private or public cloud-based workloads. “Clients no longer have to develop and work with proprietary tools, and z/OS is being brought completely into the modern application-development world,” Mauri said.

Gartner recently wrote of that trend: “Now developers, testers, and infrastructure and operations staff have the capability to utilize the same tools which exist in the distributed world. Rocket Software, CA Technologies, and IBM are supporting the Open Mainframe Project Zowe, which is making adapting of open-source tools much easier.”

IBM calls Zowe a framework that lets development and operations teams securely manage, control, script, and develop on the mainframe like any other cloud platform.

“From application development software, to complex DevOps orchestration engines, these traditional platforms are enjoying a resurgence of relevance in the data center that is making them accessible to all developers and testers. This change significantly reduces the issue of limited and old-fashioned development tools that previously contributed to the impetus to leave traditional platforms,” Gartner stated.

Going forward, Mauri said he expects a number of key technologies to continue to make the mainframe a key cloud and compute player.
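To make the “like any other cloud platform” claim concrete, the Zowe CLI from the Open Mainframe Project wraps z/OS datasets and batch jobs in commands that feel like any other cloud tool. The session below is a hypothetical sketch, not output from a real system: the host, user, and dataset names are placeholders, and exact command syntax varies by Zowe CLI version.

```shell
# Create a connection profile pointing at a z/OSMF endpoint
# (host, user, and password here are placeholders)
zowe profiles create zosmf-profile myzos --host mf.example.com --port 443 \
  --user ibmuser --password '****' --reject-unauthorized false

# List datasets under a high-level qualifier, much like listing cloud storage
zowe zos-files list data-set "IBMUSER.*"

# Submit a batch job from a JCL member and check its status
zowe zos-jobs submit data-set "IBMUSER.JCL(BUILD)"
zowe zos-jobs list jobs --owner ibmuser
```

Because these are ordinary CLI commands, they slot into the same shell scripts and CI pipelines developers already use for distributed platforms, which is the point Gartner and IBM are making.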
Confidential Computing

One of those keys is an overarching security model called Confidential Computing, which IBM broadly describes as a way to protect data, applications and processes at scale. It has rolled out a number of products that adhere to Confidential Computing principles. For example, IBM’s Secure Execution for Linux software lets customers isolate and protect large numbers of workloads from internal and external threats across a hybrid-cloud environment. Other packages let customers lock down containerized Kubernetes workloads or Red Hat OpenShift clusters, IBM says.

There is also a Linux Foundation project, the Confidential Computing Consortium, made up of Alibaba, Arm, Baidu, IBM/Red Hat, Intel, Google Cloud and Microsoft, that is pushing the concept industry-wide. “The organization aims to address data in use, enabling encrypted data to be processed in memory without exposing it to the rest of the system, reducing exposure to sensitive data and providing greater control and transparency for users,” the Consortium says on its website. “This is among the very first industry-wide initiatives to address data in use, as current security approaches largely focus on data at rest or data in transit. The focus of the Confidential Computing Consortium is especially important as companies move more of their workloads to span multiple environments, from on premises to public cloud and to the edge.”

Mauri says IBM is on its fourth generation of Confidential Computing technology, which he expects will keep it out in front of other industry cloud players and give the company a strong security weapon for the foreseeable future.

“The vulnerability landscape is constantly changing, and organizations can be attacked across their IT systems. Add to that concerns around data privacy and regulations and you’ve got a full plate,” said Terri Cobb, lead alliance partner at Deloitte Consulting.
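The Consortium’s “data in use” distinction is easy to see in code. The sketch below uses a toy XOR routine as a stand-in for real encryption (it is not a real cipher, and the record contents are invented): the point is that a conventional system must decrypt data into ordinary memory before computing on it, and that decrypted copy is exactly the exposure Confidential Computing’s isolated execution is meant to remove.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' for illustration only -- not real cryptography."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)                      # the encryption key
record = b"account=12345;balance=9000"    # hypothetical sensitive record

# Data at rest: stored encrypted, protected on disk or in a database.
at_rest = xor_cipher(record, key)

# Data in use: to compute on the record, a conventional system must
# decrypt it into ordinary memory, where the plaintext is visible to
# the OS, the hypervisor, or anyone who can read that memory. Closing
# this gap -- keeping the data protected even while in use -- is what
# technologies like IBM's Secure Execution for Linux aim to do.
in_use = xor_cipher(at_rest, key)
assert in_use == record   # plaintext now sits exposed in process memory
```

Encryption at rest and in transit is routine; it is this third state, data in use, that the Consortium calls out as the remaining blind spot.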
Deloitte recently conducted a survey of business and IT decision-makers with Forrester Consulting and found that 80% of respondents are focused on modernizing mainframe toolsets in an effort to identify and prevent data breaches, and that 73% are increasing their security footprint. “Data protection and security are so critical, and mainframes remain one of the most secure and powerful platforms available when the right controls are in place,” Cobb said.

Pay per use

Another direction IBM and the mainframe are moving is toward a more cloud-agile, consumption-based licensing model that lets customers pay only for what they consume, Mauri said. The company rolled out its Tailored Fit Pricing model in 2019 and has upwards of 80 customers on board so far, Mauri said. It offers two consumption-based pricing models that can help customers cope with ever-changing workloads, and hence software costs. Mauri said IBM expects to make hardware more consumption-based in the future.

Predicting demand for IT services can be a major challenge, and in the era of hybrid and multicloud, everything is connected and workload patterns constantly change, Mauri wrote in a 2019 blog about the new pricing and services. “In this environment, managing demand for IT services can be a major challenge. As more customers shift to an enterprise IT model that incorporates on-premises, private cloud and public cloud, we’ve developed a simple cloud pricing model to drive the transformation forward.”

ML/AI opportunities

Others say technologies such as machine learning and artificial intelligence will also drive future mainframe development. “Data insights help drive actionable and profitable results, but the pool of data is growing at astronomical rates. That’s where AI can make a difference, especially when it’s on a mainframe. Consider the amount of data that resides on a mainframe for an organization in the banking, manufacturing, healthcare, or insurance sectors.
You’d never be able to make sense of it all without AI,” said Deloitte’s Cobb.

As an example, Cobb said core banking operations can do more than simply execute large volumes of transactions. “Banks need deep insights about customer needs, preferences, and intentions to compete effectively, along with speed and agility in sharing and acting on those insights. That’s easier said than done when data is constantly changing. Now if you can analyze data directly on the mainframe, you can get near real-time insights and action. That makes the mainframe an important participant in the AI/ML revolution,” Cobb said.

The mainframe environment isn’t without challenges going forward. For example, there is a growing market for moving mainframe applications off of the Big Iron and onto cloud services. Large cloud players such as Amazon, Google and Microsoft are also involved in modernizing mainframe applications. For example, Google Cloud in February bought mainframe cloud-migration service firm Cornerstone Technology with an eye toward helping Big Iron customers move workloads to the private and public cloud. An ecosystem of mainframe-modernization service providers, such as Astadia, Asysco, GTSoftware, LZLabs and Micro Focus, has also grown up.

COBOL coders needed

Another challenge is finding and developing the right people to cultivate the mainframe environment. “It was predicted mainframes would eventually cease to exist, so colleges stopped offering courses focused on COBOL and other critical mainframe skills. As Baby Boomers retire, mainframe talent concerns are becoming a reality,” Cobb said.

Deloitte’s survey found that 71% of respondents said their teams are understaffed, and 93% said it’s “moderately” to “extremely challenging” to acquire the right mainframe resources and skills, Cobb said. “Many large companies are addressing this issue by hiring and developing college recruits, developing a mentoring program, creating an internship, or turning to third parties for support.
Mainframes aren’t going anywhere—the talent pool needs to match the demand,” Cobb said.

While there are challenges ahead, Cobb said the consultancy’s survey showed customers are looking to increase their investment in the mainframe: 91% of respondents identified expanding their mainframe footprint as a moderate or critical priority in the next 12 months.