BMC Tightens Integration Across DevOps Tools for Mainframes

Content by DevOps.com

BMC has extended its BMC Automated Mainframe Intelligence (AMI) and BMC Compuware portfolios as part of an ongoing effort to make mainframes just another platform for DevOps teams to build and deploy applications on.

John McKenney, senior vice president for strategy and innovation for ZSolutions at BMC, said the DevOps technologies that the company gained last year with the acquisition of Compuware have now been fully integrated with the rest of the BMC mainframe portfolio.

The BMC Compuware tools have now also been integrated with Visual Studio Code, a free editor from Microsoft that is widely used by developers, he added.

The BMC Compuware ISPW solution integrates with Visual Studio Code to enable developers to compile changed code on the mainframe with a right-click.

In addition, the BMC Compuware Topaz platform for applying DevOps practices to the development of mainframe applications has gained a new command-line interface (CLI) along with support for multiple application programming interfaces (APIs).

The BMC Compuware File-AID 21 tool now reduces the time required to search billions of transactions and log files on a mainframe, while the BMC AMI Security tool has been integrated with the BMC Helix IT service management (ITSM) platform.

Finally, the BMC AMI Ops Insight tool has been updated with diagnostic capabilities that make it easier to identify the probable cause of an issue, while the BMC AMI SQL Performance for DB2 tool now includes a plug-in for the open source Jenkins continuous integration/continuous delivery (CI/CD) platform.

McKenney said the goal is to integrate mainframes with any DevOps workflow as part of an effort to either deploy new applications or modernize legacy applications already running on the platform. Modernization is being driven largely by the need to add APIs to mainframe applications so they can be integrated with a much broader array of distributed applications, he noted.

Much of the demand for those APIs is being driven by digital business transformation initiatives that require applications to be able to access backend transaction processing and analytics applications running on mainframes, added McKenney.

It’s not clear at what rate IT teams that build and deploy applications on mainframes are moving away from waterfall-based processes to embrace DevOps best practices. However, McKenney noted, the rate of changes being made to mainframe applications has accelerated significantly over the last few years.

While large numbers of applications have been migrated off mainframes over the last decade, the number of enterprise IT organizations that continue to rely on the venerable platform to process transactions remains considerable. Many organizations aren’t convinced that it’s worth rewriting the applications they have relied on for decades when it’s easier to make them more accessible to other applications by adding an API. In effect, McKenney noted, mainframe applications are becoming just another microservice that can be invoked by a developer.
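From the distributed side, invoking a mainframe application that has been wrapped with an API looks no different from calling any other microservice. The sketch below illustrates the idea, assuming a hypothetical REST gateway fronting a mainframe transaction; the URL, path, and payload shape are illustrative assumptions, not a real BMC or IBM interface.

```python
import json
from urllib import request

# Hypothetical API gateway that fronts a transaction running on the
# mainframe. The endpoint and payload format are assumptions for
# illustration only.
GATEWAY_URL = "https://api.example.com/mainframe/v1"

def build_account_lookup(account_id: str) -> request.Request:
    """Build an HTTP request that invokes a mainframe transaction
    exposed as a REST service, just like any other microservice."""
    body = json.dumps({"accountId": account_id}).encode("utf-8")
    return request.Request(
        url=f"{GATEWAY_URL}/accounts/lookup",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_account_lookup("0012345678")
# A distributed application would now send this with request.urlopen(req)
# and consume the JSON response; the caller never needs to know the
# backend is a decades-old mainframe application.
print(req.full_url, req.get_method())
```

The point of the pattern is that the consuming developer sees only an HTTP contract; whether the service is backed by a container or a CICS transaction is invisible at the call site.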

Regardless of the motivation, it’s clear mainframe applications will continue to run well into the next decade. The challenge IT teams now face is finding a way to make those mainframes just another platform on which to deploy applications at the end of a DevOps toolchain.
