Top 5 architectural challenges for modern mainframe computing

Five experts weigh in on the biggest challenges facing modern mainframe computing and how to meet them.
Any large enough monolith can break down into a microservice.

Photo by timJ on Unsplash

In the eyes of many, mainframes are a powerful yet underappreciated asset in today's enterprise architecture. While much of the architectural world is focused on data centers powered by racks of x86 machines, the fact remains that a significant amount of corporate data on the planet lives on a mainframe.

The actual usage is hard to pin down. One study says that 80% of all corporate data resides or originates on mainframes. Another offers a less ambitious, but still impressive, 40-50%. A third says that 70% of Fortune 500 companies use mainframes. There seems to be no common agreement, but even at the low end, 40% of the corporate data in the world is a lot of data!

Mainframes continue to play an important role in corporate IT because they are computing powerhouses. A single installation can handle 2.5 billion transactions daily, the equivalent of 100 Cyber Mondays! However, along with the power they bring to the table come challenges that need to be addressed.

Thus, to understand what these challenges are and how they can be met, we at Enable Architect interviewed five experienced Enterprise Architects who are experts in mainframe technology. According to those interviewed, the top challenges are:

  1. Making sure the technology really addresses the customer's problem
  2. Better integration of Database Administrators into DevOps practices
  3. Finding and reducing shadow IT
  4. Improving mainframe tool standardization for effective integration into DevOps deployment patterns
  5. Avoiding misguided modernization efforts

Let's examine the details of our findings more closely.

1. Making sure the technology really addresses the customer's problem

As powerful as mainframes are, the applications that run on them are only as good as the development lifecycle process under which the software is created. Some mainframe shops have adopted approaches in line with the Agile mantra of continuously delivering ever-improving software that meets user needs. Yet, for many projects, meeting the real needs of the customer has been a challenge.

As Bill Bitner, lead on IBM's z/VM Development Lab Client Focus & Care team, pointed out in a recent interview with Enable Architect, "There were times where we would create technology on the platform that was innovative, generated patents, and was cool technology. However, it would sometimes miss the mark. Or would be interesting technology to mitigate a problem, but did not always get to the pain point or root cause of the pain."

In his work as a customer advocate, Bitner discovered that there were times when such misalignment was caused by the most benevolent of intentions. Proponents become so enthusiastic about a technology's potential that they concentrate on what they think the customer should want instead of on what the customer actually needs. A telling example involves documentation. According to Bitner, "I will sometimes hear that the platform ... is 'difficult' to learn. [W]e are trying to make the time to onboard z/VM skills shorter, but what I often point out is that some of it is not the difficulty, but that there is so much capability and therefore much more to learn."

Early attempts to meet the perceived need resulted in voluminous documentation that was hard to understand and hard to absorb. Bitner advocates a different approach. According to him, "we are shifting from documentation that tells the customer 'everything we know' to documentation that tells them 'what they need to know'. That shift is difficult for some of us as we love the technology so much, of course we want to tell people everything." He also calls for intentionally architecting the developer experience, as we recently wrote about.

Bitner adds, "I think in the past five to ten years, we've shifted back to being better listeners. We have done a much better job of understanding the pain points and the needs of our clients, and then bringing technology and innovation to create solutions."

The key takeaway here is that listening to the real needs of end users is hard, particularly when those making the technology are enamored with its capabilities. This is as true for mainframes as it is for any other technology. Sometimes it's easy to lose sight of the forest by focusing too much on the individual trees.

2. Better integration of Database Administrators into DevOps practices

In the past, it was all too common for a database developer to change the name of a column in a table, only to have that change wreak havoc on every downstream SQL statement that used the table. For the most part, there was no direct connection between the code and the database. As a result, aligning database development with application programming has been a difficult undertaking.
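
To make the problem concrete, here is a minimal sketch of that kind of breakage. It uses SQLite (3.25 or later, for RENAME COLUMN) from Python purely so the example is self-contained; a mainframe shop would be on Db2, and the table, column, and query are hypothetical.

```python
# Minimal illustration (hypothetical schema): a column rename made in isolation
# breaks a downstream SQL statement that was written against the old name.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (acct_no TEXT, balance REAL)")
conn.execute("INSERT INTO accounts VALUES ('000123', 250.00)")

# Application code written long ago still refers to the old column name.
downstream_query = "SELECT acct_no, balance FROM accounts WHERE acct_no = ?"

# The schema change happens with no connection to the application code.
conn.execute("ALTER TABLE accounts RENAME COLUMN acct_no TO account_number")

try:
    conn.execute(downstream_query, ("000123",))
except sqlite3.OperationalError as err:
    # The failure only shows up at runtime, downstream of the change.
    print(f"Downstream statement broke: {err}")
```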

According to Craig S. Mullins, President and Principal Consultant at Mullins Consulting, Inc., an IBM-endorsed Champion for Data and AI and IBM Db2 Gold Consultant, this is a significant challenge. As Mullins shared in a recent interview with Enable Architect, "Supporting the database has lagged behind in the world of DevOps, especially in the mainframe world. DBA procedures are part of the Ops in DevOps. Sometimes the Dev gets emphasized more than the Ops when DevOps is implemented, but you can't omit the Ops."

The solution, according to Mullins, is to broaden day-to-day DevOps activities to include database administrators.

Mullins continues, "By deploying agile development with DBAs participating in teams along with the developers, you get increased cooperation and communication between the folks coding the application (that's the Dev) and the folks developing and managing the database (that's the Ops or DBA). This means that many DBAs are working in teams with developers instead of in teams of other DBAs, at least for periods when development projects are very active."

DevOps, along with Agile practices, is known to increase efficiency in the software development lifecycle. Including DBAs as day-to-day members of a software development team is probably one of the easiest ways to bring comprehensive change management to mainframe development.
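
One hedged sketch of what that day-to-day integration can look like: schema changes treated like application code, written as versioned migrations, reviewed by the DBA and the developers together, and applied by the same pipeline that ships the application. The runner below is purely illustrative, again using SQLite only to stay self-contained; a real mainframe shop would lean on its Db2 change-management tooling rather than a hand-rolled script.

```python
# Hypothetical sketch: schema changes live in version control next to the
# application code and are applied in order by the delivery pipeline.
import sqlite3  # stand-in for Db2, purely to keep the example runnable

MIGRATIONS = [
    # 001: the DBA's rename, merged in the same change set as the developer's
    # query update, so neither can ship without the other.
    "ALTER TABLE accounts RENAME COLUMN acct_no TO account_number",
]

def apply_migrations(conn: sqlite3.Connection) -> None:
    """Apply any migrations that have not yet been recorded as applied."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (id INTEGER)")
    applied = conn.execute("SELECT COUNT(*) FROM schema_version").fetchone()[0]
    for number, statement in enumerate(MIGRATIONS[applied:], start=applied + 1):
        conn.execute(statement)
        conn.execute("INSERT INTO schema_version VALUES (?)", (number,))
        print(f"applied migration {number:03d}")

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (acct_no TEXT, balance REAL)")  # baseline schema
    apply_migrations(conn)
```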

3. Finding and reducing shadow IT

As mentioned above, the benefit of mainframes is that they're computing powerhouses. However, for many companies, they are also islands unto themselves. Both logic and data are contained in one very big box. While this might suit the company's needs at a global level, IT departments struggle with the isolation internally, particularly around data. Many times groups outside of mainframe development will address the issue by copying data out of the mainframe into other environments. This approach is called shadow IT.

Jim Porell, Director and Solutions Architect at Rocket Software and former Distinguished Engineer and Chief Architect for all mainframe software at IBM, describes the practice, "Shadow IT [is] the term we came up with [where] you've got the mainframe doing what it's doing, and you have this shadow IT [group], that's copying, what's done in the mainframe because these guys over here want independence and to control their own destiny."

In other words, shadow IT is a group that spends time copying data that resides on a mainframe over to another target, usually an x86-powered cloud installation. Their goal is to use the data for their own autonomous purposes.

At face value, shadow IT doesn't appear to be a problem. Porell points out, however, that it's actually quite costly. "I have one customer [that] was spending $200 million a year on a mainframe, but was spending 100 million on distributed stuff… taking stuff off the mainframe. But they had a five-year projection of going to $1.3 billion on non-mainframe!"

Porell counsels his customers that the alternative to shadow IT is to share the data and leverage the best of all the available technologies. He said, "I'm not going to copy data. I'm going to share the data. So rather than make copies and independently manage up, I'm going to manage it all together and share resources around security, around resilience, around data and transaction programs. And I'm going to use the best of each of the technologies, instead of having independent silos. Large businesses can save hundreds of millions of dollars when they collaborate across their IT groups rather than manage them as silos."
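
As a hedged illustration of "share, don't copy" from the distributed side, the sketch below queries the system of record in place instead of maintaining a nightly extract in a separate x86 data store. The ibm_db driver, connection details, and table are illustrative assumptions, not a prescription for any particular shop.

```python
# Illustrative only: query Db2 data where it lives rather than maintaining a
# separate, shadow copy. Host, port, credentials, and table are hypothetical.
import ibm_db

conn = ibm_db.connect(
    "DATABASE=PRODDB;HOSTNAME=zhost.example.com;PORT=446;PROTOCOL=TCPIP;"
    "UID=appuser;PWD=secret;",
    "",
    "",
)

stmt = ibm_db.exec_immediate(
    conn, "SELECT account_number, balance FROM accounts WHERE balance > 1000"
)

row = ibm_db.fetch_assoc(stmt)
while row:
    print(row)  # each row is consumed in place; nothing is copied out
    row = ibm_db.fetch_assoc(stmt)

ibm_db.close(conn)
```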

Finding and reducing shadow IT is a good way to minimize costs without compromising operational efficiency.

4. Improving mainframe tool standardization for effective integration into DevOps deployment patterns

Integrating competing technologies can be a problem, especially when there are both technical and cultural disparities in play. This problem is particularly vexing for large banks, which have both legacy mainframes and emerging x86-based technologies to support, and whose IT budgets can run into the hundreds of millions, if not billions, of dollars.

Ralph Van Beek, DevOps Architect at Rabobank, knows the situation all too well. Rabobank, a Dutch multinational banking and financial services company headquartered in Utrecht, Netherlands, had €590 billion in assets in 2019. It is a very big business, and it is also a significant IBM customer.

Van Beek describes the situation this way, "IBM tries to balance open source and closed source strategies together. The Rational and UrbanCode products are both closed source tools from IBM. These products have different development histories and therefore a conflicting architectural setup. What IBM does as well is that they, in parallel, aim for open source tools like Git, VSCode, Eclipse and Jenkins. Tools that in many aspects have duplicity with the closed source tools and bring even more architectural conflicts to the work field. It can be quite a struggle to tie these all together."

According to Van Beek, the challenge of integrating all of these products and technologies grows as automation becomes a more prominent part of CI/CD processes.

As Van Beek reported during his interview with Enable Architect, "If you want to automate a process, it should be formalized. It should be standardized. And then you need to have very good arrangements [about] how all the products in this flow interact with each other. If not ... you lose a lot of time in tying them together. And in the end, you will still have a sub-optimal, integrated user experience."

In Van Beek's opinion, it's a lack of standardization that slows down the mainframe's agility. His observation is that if businesses want fast deployments, they'd rather go with x86 installations. x86 platforms tend to adhere to general conventions based on open source standards. There's still a good deal of mainframe tooling that's proprietary. As Van Beek says, "The lack of a tooling suite that delivers a great agile delivery experience on the mainframe is one of the biggest inhibitors of the mainframe becoming yet as mainstream, again, as cloud currently is."
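
To ground Van Beek's point that a process has to be formalized and standardized before it can be automated, here is a minimal sketch of a mainframe build step driven from an ordinary CI job. It assumes the open source Zowe CLI (which comes up again in the next section) is installed and configured with a z/OSMF profile; the data-set names and JCL member are invented for illustration.

```python
# Hypothetical CI step: push a changed source member to the mainframe and
# submit the build JCL through the Zowe CLI. Names are illustrative only.
import subprocess

def run(cmd: list[str]) -> None:
    """Run one pipeline step and fail the build if the command fails."""
    print("->", " ".join(cmd))
    subprocess.run(cmd, check=True)

def build_on_mainframe(local_file: str, source_pds: str, build_jcl: str) -> None:
    # Upload the changed source into the build library.
    run(["zowe", "zos-files", "upload", "file-to-data-set", local_file, source_pds])
    # Submit the compile/link JCL; a real pipeline would also wait for the job
    # and check its return code before promoting the build.
    run(["zowe", "zos-jobs", "submit", "data-set", build_jcl])

if __name__ == "__main__":
    build_on_mainframe("src/hello.cbl", "MY.TEAM.SOURCE(HELLO)", "MY.TEAM.JCL(BUILD)")
```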

5. Avoiding misguided modernization efforts

Two types of modernization can take place in corporate IT. The first type of modernization is when new technologies address real shortcomings in a company's IT portfolio. The second type is when new technologies are nothing more than a rehash of capabilities that already exist in good working order on legacy systems. Addressing shortcomings in a cost-effective manner is good. Making change for the sake of making change rarely has benefits. And yet it happens, particularly in companies that have many older mainframe computers.

It's a real problem, one that Joe Winchester, Senior Technical Staff Member at IBM in the UK and an active member of the Zowe technical community, described in a recent interview with Enable Architect: "I've seen modernization efforts fail because they bring modern tools or languages to the mainframe without recognizing what's there already, and often try to get experienced mainframers to give up what they are familiar with and productive at using."

Redundant development activity is the bane of many IT departments. Many times it occurs not out of negligence but because of poor communication. When an IT department has hundreds of teams working on a diverse set of projects, it's hard to know when work overlaps, even at the managerial level. Thus, effective communication is critical, and the means of communication need to go beyond inter-office emails and weekly staff meetings. According to Winchester, a good way to promote effective communication is to participate in the communities around the technologies a company uses.

Winchester points to his work on the Zowe project as an example of such a community. (Zowe is a collection of open source projects sponsored by the Linux Foundation under the auspices of the Open Mainframe Project.)

Winchester describes the work as more than building a technology, "We are building a community. Zowe has a Slack channel where developers and architects can talk directly to customers (and vice-versa). Zowe practices continuous integration and continuous delivery. Zowe has a conformance program that documents and underwrites good API [development]."

The wisdom of crowds is a documented phenomenon. A community can see things a few individuals can't. Putting its wisdom to use according to the mechanisms and practices found in open source development will go a long way toward avoiding the hazard of the misguided modernization efforts that often accompany mainframe development.

Putting it all together

When it comes to cool technologies in modern IT, mainframe computers are not at the top of the list. Their significance as part of a modern architecture is undeniable, however. For the companies that have them, mainframes will remain the foundation of a good portion of enterprise computing.

Many companies have realized that replacing a mainframe is not a cost-effective way to address the shortcomings they may attribute to it. Instead, the path many companies are taking is to upgrade their existing mainframe systems. It makes financial sense.

Evolution is also a valid option. Mainframes continue to evolve, as do the IT experts who work on them. For teams that are willing and able to evolve, the work will come with challenges; it always has and always will. The trick is to understand the most pressing challenges at hand and have a plan in place to address them. Hopefully, this article's expert opinions will provide the encouragement needed to move mainframe development forward with a bit more ease.


Bob Reselman

Bob Reselman is a nationally known software developer, system architect, industry analyst, and technical writer/journalist.
