In August 2021, IBM announced Telum, the next-generation processor for IBM Z. This early announcement of Z chip technology provided a great opportunity to showcase innovations coming to the mainframe world, including technology designed to reinforce IBM Z’s position as a highly resilient, scalable, and securable world-class platform. Telum’s new AI capabilities in particular will bring tremendous value to the workloads and data running on Z systems.
One of the key features of Telum is its dedicated on-chip inference accelerator. The accelerator is designed to deliver AI inference in real time, at a scale that makes it a perfect match for the high-volume, transactional workloads that typically run on Z. AI use-cases integrated into these workloads already benefit immensely from proximity to the transactions and data they relate to; the Telum AI accelerator optimizes them even further, helping ensure critical service level agreements (SLAs) are met. In doing so, the accelerator also unlocks the possibility of using more complex models that deliver insights with a higher degree of accuracy.
Innovations like this are key to strengthening IBM Z’s position as a world-class platform for artificial intelligence. As SHARE’d Intelligence readers, you know the immense volume of data and the mission-critical nature of the workloads that run on IBM Z. These factors present a massive opportunity for enterprise clients to apply AI to a variety of use-cases and derive additional value from IBM Z systems and data.
What does AI on Z mean for you?
The Telum AI accelerator was designed because IBM continues to recognize the importance of AI in the enterprise. AI is affecting virtually every industry, and IBM Z workloads are no exception.
For those in a traditional IT role focused on Z, such as a systems programmer, IT architect, or application architect, AI use-cases may look much like other use-cases and technologies that interact with core mainframe workloads. The challenge a new and complex area like AI brings, then, is understanding what you need to know and how to position yourself for success. For example, you may have to deploy programs that run AI models, manage them and the resources they consume, and call the services they provide. This will certainly require learning new applications and concepts; however, it typically does not mean taking on tasks such as creating and training a deep learning or machine learning model for a use-case (that is the role of a data scientist).
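To make that concrete, here is a minimal sketch of what calling the services a deployed model provides might look like from an application component. It assumes the model has been packaged by your data science team and exposed as a REST scoring endpoint; the endpoint URL, payload fields, and response shape are hypothetical and will vary with how the model is actually deployed.

```python
# Minimal sketch: calling a deployed model-scoring service from an application.
# The URL, payload fields, and response shape below are hypothetical; the real
# interface depends on how your data science team packages and deploys the model.
import requests

SCORING_URL = "https://scoring.example.com/models/fraud-detection/score"  # hypothetical endpoint

def score_transaction(transaction: dict) -> float:
    """Send one transaction to the scoring service and return its risk score."""
    response = requests.post(
        SCORING_URL,
        json={"inputs": [transaction]},
        timeout=0.5,  # keep latency bounded so transactional SLAs are respected
    )
    response.raise_for_status()
    # Assume the service responds with {"scores": [<float>, ...]}.
    return response.json()["scores"][0]

if __name__ == "__main__":
    txn = {"amount": 182.50, "merchant_category": "5411", "country": "US"}
    print(f"Model risk score: {score_transaction(txn):.3f}")
```

In a pattern like this, your responsibilities center on deploying the service, managing the resources it consumes, and meeting response-time targets, not on how the model itself was built.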
Even more than on most projects, a key factor for success with AI is ensuring that you are not siloed. Many organizations struggle to bring AI use-cases to production, so an early understanding of the use-case, its data requirements, and similar project aspects can help pave the way. In practice, this means engaging with data scientists and data engineers to ensure critical business requirements are met when AI is deployed to production.
Getting started with AI on IBM Z
Getting started with an unfamiliar technology can be a challenge. Thankfully, in the case of AI, there is a great deal of material available to help!
One good first step is to understand what you can do with AI on Z: the strategy IBM is pursuing for the platform, its capabilities, and the available software. There is already a great deal you can do with AI on Z, even before Telum arrives in the next generation of Z machines. It’s quite easy to get your hands on some of this technology now and begin the learning process.
Here are some great resources to start with:
All of these resources are updated regularly, so check back often for the most up-to-date information.
Additionally, the open-source community offers a wealth of knowledge, tutorials, and guidance to help you start learning about AI in general.
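For orientation, here is a small, self-contained example of what creating and training a machine learning model looks like in widely used open-source tooling (scikit-learn in this sketch). The dataset is synthetic and purely illustrative; the same concepts apply to whichever framework your data science team adopts.

```python
# A small, self-contained illustration of training and scoring a machine
# learning model with open-source tooling (scikit-learn). The data is
# synthetic and exists only to make the example runnable.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Generate a toy binary-classification dataset.
X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Training the model is typically the data scientist's task.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# Evaluating and scoring is the part that later gets deployed and called.
print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")
print(f"Sample prediction: {model.predict(X_test[:1])[0]}")
```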
Beyond the materials outlined in this article, IBM has created an AI on Z Discovery Garage that is free of charge to IBM Z and LinuxONE clients. The garage can be tailored to your individual needs and has the following goals:
- Discover AI on Z technology.
- Understand how an AI project matures from pilot stage to production, and the role IBM Z can play.
- Understand real-world applications of AI to enterprise workloads.
- Learn how to leverage zCX to bring the open-source AI ecosystem to z/OS (see the sketch after this list).
- Get hands-on with labs that build familiarity with AI technologies and show how they can integrate with z/OS workloads.
- Explore use-cases.
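To illustrate the zCX item above, the following sketch uses the Docker Engine API (through the open-source Python docker SDK) to start an open-source model-serving container, which is the kind of workflow zCX enables on z/OS. The image name, port, model path, and the assumption that your environment points at a zCX-provisioned Docker engine are illustrative only, not a prescribed setup.

```python
# Hedged sketch: starting an open-source model-serving container via the
# Docker Engine API, the kind of workflow zCX makes possible on z/OS.
# Assumptions: the local Docker environment (or DOCKER_HOST) points at a
# zCX-provisioned Docker engine, the serving image is available for your
# architecture, and the model lives under /models/fraud. All of these are
# illustrative, not a prescribed configuration.
import docker

client = docker.from_env()

container = client.containers.run(
    "tensorflow/serving:latest",          # example open-source serving image
    detach=True,
    ports={"8501/tcp": 8501},             # REST scoring port
    volumes={"/models/fraud": {"bind": "/models/fraud", "mode": "ro"}},
    environment={"MODEL_NAME": "fraud"},
    name="fraud-model-serving",
)
print(f"Started container {container.short_id}; REST scoring available on port 8501")
```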
If you are interested in participating in the IBM AI on Z Garage, or want to engage with AI on Z experts, please reach out to aionz@us.ibm.com.
Andrew Sica is a senior technical staff member leading AI development for IBM Z. Andrew has worked on various initiatives across Z in his 21 years with IBM, ranging from z/OS development to efforts such as Tailored Fit Pricing for IBM Z. In his current role, Andrew is focused on delivering next-generation capabilities that help IBM clients accelerate their journey to adopting AI on Z.