A RedMonk Conversation: Generative AI for Application Modernization on System z


In this RedMonk Conversation with Keri Olson, IBM VP of AI for Code, we discuss IBM’s approach to generative AI, focusing on domain-specific use cases such as AI-driven application modernization for IBM’s Z mainframe platform with IBM watsonx Code Assistant for Z. IBM is training dedicated models for these use cases to give customers something they won’t find with competitors’ models: for example, COBOL-to-Java conversion, with models trained to support all of IBM’s mainframe subsystems, such as CICS. Olson explains how IBM enables customers to use AI for specific use cases while maintaining data privacy, security, and customer trust. IBM also offers IBM watsonx Code Assistant for Red Hat Ansible Lightspeed, which helps operators be more productive using Ansible infrastructure as code on large-scale systems.

This was a RedMonk video, sponsored by IBM.




James Governor: Hey, everybody, it’s James Governor here, analyst at RedMonk. I’m here today with Keri Olson, VP of AI at IBM. And we’re going to be chatting a little bit about LLMs, the use of AI to improve software delivery. And that’s a context that I think almost everyone in the industry is thinking about right now. So first of all, I’d like to say, Keri, welcome. Thanks for joining us.

Keri Olson: Thank you very much. Great to be here.

James: Great. And so, yeah, I think that from a context perspective, we’ve had this big leap forward as an industry. AI is something that we’ve been talking about as an industry for the longest time. It has had some false starts, probably, if we think about companies that are still around that have been working on AI for the longest time. I suspect that IBM may well win that award. Natural language processing has been one of the longest-running pieces of research at IBM in the form of speech recognition. A lot there. But the current generation that we’re talking about was really kicked off with the industry basically going gaga when ChatGPT dropped. So I think we’ve all been in somewhat of a responding mode. Pretty clearly as an industry, we’re trying to understand exactly what the best use cases are, where AI is going to be most effective. One of the places that RedMonk has seen, and that a lot of people in the industry have seen, as I say, is software development. People are able to do things more quickly than before, and perhaps they’re able to do some things that they haven’t been able to do before when they’re working in a domain that they don’t understand as well, as long as they’ve still got the skills to actually move forward.

So I guess in this, there’s now literally so much progress in this space on a daily basis. We’re seeing new models drop, we’re seeing a lot of change and so on. So a couple of questions, Keri. A, what’s IBM’s play here? And B, what’s your role in navigating that change?

Keri: Great. Thank you very much, James. And as you said, this space is moving so quickly, really at unprecedented rates. And there is a keen focus on AI as well as AI for code. So my team is very specifically focused on AI for code and bringing code assistance to market to help organizations in the software development life cycle.

But let me back up and talk a little bit about IBM’s approach to generative AI, because I do think that it’s unique and different. So IBM has built an AI and data platform that includes three key components: watsonx.ai, watsonx.data, and watsonx.governance. Now, on top of this platform, we also bring purpose-built AI assistants to the market. And all of this together is really helping organizations to accelerate and scale the impact of AI across their organization. And they’re turning to watsonx for a few reasons. Number one, IBM’s approach to AI is open. It’s targeted, it’s trusted, and it’s empowering.

So let’s talk first about our concept of open. watsonx is based on open technologies. The base of our product is Red Hat OpenShift and OpenShift AI. And it’s important that we have that basis for our platform, as it really provides a variety of ways that customers can bring models to work in their enterprise, and it helps them to deliver use cases with compliance in mind, with trust and transparency in mind. So as we start talking about trust and transparency, our product and our platform is built with trust, transparency, responsibility and governance in mind, so that organizations can truly manage their legal, regulatory and compliance requirements. And we do all of that in a purpose-built approach. So we are targeting very specific domains, domains like human resources, domains like customer service, IT operations, and of course software development, which we’re going to talk about quite a bit today.

The last thing that I’ll say about IBM’s approach with our AI and data platform is that it’s very empowering. So not only does it allow organizations and users within organizations to utilize AI, to be consumers of AI, but it actually allows these organizations to start being AI value creators by giving them the tools that they need to create their own AI concepts and projects within their enterprise. So IBM’s approach is differentiated and it’s unique and it’s ultimately bringing a ton of value to our customers.

James: Okay, so, I mean open. I mean, pretty clearly this is nothing new from an IBM playbook perspective in terms of sort of… IBM has been doing open source since Linux was a thing, the openness thing. And obviously you work so much in regulated industries, it’s no surprises that compliance would be front and center. Just one question there in terms of your customer base. So much of what we hear about is the model training and so on. And these ideas have all got to be lots of big GPUs in the cloud, yet you probably got customers that are saying, hang on, we want to run, perhaps the inference, we want to run something on prem. So what does the balance look like there? And is there an aspect that needs to be considered in terms of trust? Because perhaps people are a little bit cautious about just shoveling all of their data into, well, frankly, into OpenAI.

Keri: Yes, I think that is a concern for customers as customers are really looking to utilize generative AI for their core capabilities and their core business processes. So some customers truly are looking for that on-premises solution. And the great news for them is that that’s something that IBM can provide. So we provide our platform both as SaaS, so software as a service, and we provide it in the customer’s own premises, in their own data center, so they have the choice to use it as they wish based on the needs of their business. And the great thing there, James, is that not only do we provide the platform, but our assistants are going to become available on prem as well. So the customers will have a choice of using SaaS or on prem for their assistants. And the models. So IBM has proprietary models. You’ve talked about the large language models. We do have models that we have built, but we also support a wide range of open source models on our platform. So customers truly have a choice of what they want to use based on the needs of their business and based on the choice that makes the best sense for them.

James: Okay, so let me ask you a question then, from a product delivery perspective. You’re obviously working on these coding assistants. I think RedMonk has already looked at some of these. I think certainly the, what’s next for Ansible? Really interesting. Not everybody knows Ansible as well, so that’s an interesting use case. We’re effectively generating a DSL and improving the automation experience there. But I guess one of the other areas, and frankly, having spent as much of my career as I have kind of looking at mainframe technology, would be the platform support code assistant for Z. Obviously Z skills are at a premium and yeah, tell me a bit about what you’re working on and delivering in market to help your customers.

Keri: Absolutely. So this is something that IBM is really excited about, and you’ll continue to see our focus on the AI for code and AI code assistant space. So we first announced the watsonx Code Assistant family at our Think conference in 2023, and since then we have delivered two new products: watsonx Code Assistant for Red Hat Ansible Lightspeed, which you already spoke about, and watsonx Code Assistant for Z. Now, as you look at these two specific products, we are building purpose-built solutions that solve real-world problems for our customers. So we’re really focused on helping to enhance developer and IT operator productivity by taking these specific use cases and building specific foundation models that support them. So IBM has been focused on building our AI for code, or coding large language models, for many years. But as we looked to bring WCA for Red Hat Ansible Lightspeed and WCA for Z to market, what we did is we took those large language models for code and we trained them very specifically for the use cases. So bringing the Ansible content and context into the foundation model for the Ansible use case, and bringing COBOL and Java into the use case for WCA for Z.

So with watsonx Code Assistant, developers and IT operators are really getting a purpose-built solution that helps them to be more productive. Now, if we go into WCA for Z specifically, which you asked about, I can tell you a little bit more about what we are doing there. Now, as I mentioned, we’ve invested in our large language models for many years. What we did is we took our 20 billion parameter large language model for code and we’ve infused that with lots and lots of COBOL data, COBOL programming code and Java. And essentially with the expertise we have in IBM and the lines of code that we have in IBM related to COBOL and Java, we’re able to fine-tune that model to bring capabilities that are very, very specific to that use case. Now, that’s only the start of it, because if you think about what we’re doing with WCA for Z, watsonx Code Assistant for Z, it’s all about application modernization for the mainframe. So while we’re starting with this concept of take your legacy COBOL application and transform it to a more modern Java application, that’s really not the whole story. The whole story is that we are providing application modernization capabilities across the entire lifecycle.

So it starts with helping an organization to really understand that existing application that they have: understand the application, understand the environment that it’s working in, understand the subsystems that are supporting it. So we build capabilities into this end-to-end solution that first help the user to understand the application that they’re working with. Secondly, we provide a refactoring capability. The refactoring capability helps organizations to break down a monolithic application into a smaller set of business services that they can then understand on a function-by-function level, to understand what the application is doing and what it’s used for, and start understanding that COBOL application as it is. Now, once you take it and refactor it and have essentially smaller chunks of the application, then what you can do is you can selectively start transforming the COBOL to Java. And that resulting Java is actually working with the existing COBOL. So the fact that we’re transforming pieces of the application, that’s okay, because the entire subsystem is working together. The entire application is working together based on the transformation that we have done. So it’s not a simple “let’s translate your code.” It’s: let’s understand your application, let’s break it down to business services, and then we can selectively transform COBOL to Java.
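To make the kind of transformation Olson describes concrete, here is a purely illustrative sketch (not actual WCA for Z output): a hypothetical COBOL paragraph that computes an order total, expressed as the sort of small, independently testable Java business service that selective transformation might produce. The class name, method, and COBOL snippet are all invented for illustration.

```java
// Hypothetical source paragraph, a COBOL fragment like:
//   COMPUTE WS-TOTAL = WS-PRICE * WS-QTY
// extracted as one "business service" and rendered as a small Java class.
import java.math.BigDecimal;

public class OrderTotalService {
    // BigDecimal mirrors COBOL's fixed-point decimal arithmetic,
    // avoiding the rounding surprises of binary floating point.
    public BigDecimal computeTotal(BigDecimal unitPrice, int quantity) {
        return unitPrice.multiply(BigDecimal.valueOf(quantity));
    }

    public static void main(String[] args) {
        OrderTotalService svc = new OrderTotalService();
        // prints 59.97
        System.out.println(svc.computeTotal(new BigDecimal("19.99"), 3));
    }
}
```

Because each refactored service is this small, it can be exercised against the behavior of the original COBOL paragraph before it replaces anything, which is the point of the test step Olson describes next.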

And in the end, we help organizations to run a set of tests so that they can determine that the resulting Java code is doing what it’s supposed to do in the context of the overall environment, and they’re able to run those applications as they were before. Now, this is really important, because there are a lot of organizations out there that are running their mission-critical applications in COBOL today, and it’s becoming harder and harder to find COBOL programmers.

James: Absolutely.

Keri: Right?

James: Yeah, that is a huge issue.

Keri: Yes. So there’s definitely a skills gap out there that we’re helping to address in helping organizations to modernize some of those applications.

James: Talking about modernization, tell me about the IDE. I mean, is this modern technology? Can I use VS Code? What does it look like from a developer experience perspective?

Keri: Absolutely. So the developer experience is obviously a very important part of any coding assistant. And so for WCA for Z as well as WCA for Red Hat Ansible Lightspeed, we do support VS Code. So users of our products today can do that right in their coding environment, in terms of the transformation or the coding that they’re looking to do.

James: Awesome. Awesome. So I guess last question then, before we wrap up. So what’s next? I mean, you said you’ve taken this very specific use case based approach. I’ve talked about how quickly things are moving. That does make planning more difficult, frankly. But yeah, what do you think’s next for the industry and what’s next for you from a product management perspective?

Keri: As you mentioned, James, things are moving so quickly in this space. So I expect we will continue to see a lot of movement in the industry, a lot of movement in the market. As far as IBM goes, first of all, I’ll say that everyone should look out for some new announcements coming at IBM Think 2024. IBM coding assistants will definitely be front and center there, and it’s just a couple of short months away, so we have that to look forward to. But I’ll say that customers can really expect IBM to continue down this path of purpose-built solutions. So we’ll continue to build additional capabilities into the two products that we’ve already brought to market. We will continue to evolve and mature those and bring new capabilities to those customers. We will also look at bringing new solutions to market that are purpose-built and really focused on solving real-world problems that our customers are sharing with us on a day-to-day basis. So the watsonx Code Assistant family of products is an opportunity to help businesses and technical folks to really modernize their environments by accelerating application modernization. So I think you’ll continue to see more of that in the watsonx Code Assistant for Z space, as well as forging into other areas of application modernization. Ultimately, our main goal is to help organizations to increase developer productivity by using generative AI, and that’s what it’s all about.

James: Great. That’s fantastic. Yeah, we’ve all got a lot of work to do. There’s no doubt about that. You’re not going to be talking to any customers that don’t have a backlog of changes that need to be made. So yeah, in terms of the developer experience and all that, a couple of things. We are going to be doing a follow up show where we’re actually going to get a demo. We’ll be able to see this refactoring code gen approach in action. So stay tuned. We’ll be dropping that video soon. That really just leaves us to say today, thank you very much, Keri, for joining us. Thanks everyone for joining us. And it’ll be interesting to see how these products do in market because clearly, as I say, there is a demand and a need given the skills gaps in 2024. So Keri, thank you so much.

Keri: Thank you, James. Have a great day.

James: You too.

