By Michael Cooney, Senior Editor

IBM takes a pragmatic approach to enterprise AI

News Analysis
Apr 18, 2023 | 5 mins
Cloud Computing | Generative AI

Using artificial-intelligence 'foundation models' and its Watson natural-language analytics platform, IBM hopes to automate creation of scripts, modernize legacy systems, and accelerate scientific R&D.


When it comes to helping enterprises reap the potential benefits of AI, IBM has honed a well-learned, practical approach that differs from that used by many of its competitors.

“Our pragmatism is one of our important differentiators, too, because we know—through years of implementing and adapting AI capabilities for thousands of clients—that the journey to using the technology effectively is as important as the ultimate end-goal, especially for the mission-critical enterprises we work with,” said Tarun Chopra, vice president of IBM product management, Data and AI. 

That journey can include myriad issues such as determining the best use of the massive amounts of data available to large enterprises, perhaps integrating that data with cloud-based applications, and effectively applying the right AI models to get the best results.

“Our customers have millions and millions of dollars invested in existing systems, so they’re not going to go and build brand-new siloed AI systems,” Chopra said. “We have to figure out how we stitch all this together to work with a broader ecosystem.”

Another issue is being able to trust the data that comes out of AI systems, he said. Without being able to explain the inputs to and outputs from those systems, a highly regulated business like a bank won’t be able to get its AI past basic ethics and regulatory committees. “Forget about putting them into production,” Chopra said.

IBM’s approach to AI aims at a fundamental challenge: scalability. IBM is tackling this with what it calls foundation models: flexible, reusable models that underpin and fuel different AI techniques and applications, Chopra said.

For example, OpenAI’s ChatGPT is a foundation model that, through generative AI, addresses language tasks, Chopra said. Microsoft is introducing aspects of ChatGPT into its products, and others may follow. “But the key will be how to take some of those fundamentals that they have worked on and enable them for enterprise customers in a much more usable way,” he said. “It’s going to be at scale where it’s going to be the challenge.”

IBM Watson includes AI

IBM has deployed foundation-model techniques in its IBM Watson Natural Language Processing (NLP) stack, and the company is working to commercialize additional offerings that would go beyond language.
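
The idea behind foundation models is reuse: one pretrained model serves many downstream tasks instead of one model trained per task. The snippet below is not Watson code; it is a minimal sketch of that reuse pattern using the open-source Hugging Face transformers library as a stand-in for IBM's stack.

```python
# Not Watson code: a generic illustration of the foundation-model pattern,
# reusing one pretrained language model for a downstream NLP task with no
# task-specific training. The open-source Hugging Face `transformers` library
# stands in here for IBM's Watson NLP stack.

from transformers import pipeline

# Downloads a small pretrained sentiment model on first use.
classifier = pipeline("sentiment-analysis")

ticket = "The overnight batch job failed again and nobody was alerted."
print(classifier(ticket))  # e.g. [{'label': 'NEGATIVE', 'score': 0.99...}]
```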

IBM’s 12-year-old Watson includes an Embeddable AI portfolio and is a core part of the IBM AI strategy.

According to a recent NASDAQ report, “IBM’s Watson has evolved and is deployed for many business use cases. It’s being applied for customer service, supply chain, financial planning, risk and compliance, advertising, IT, video and security at scale.  IBM was ranked #1 by IDC for AI lifecycle software market share in February 2022 and IBM proclaims that 70% of global banks and 13 of the top 14 systems integrators use Watson and that it has over 100 million users of its AI.”

Other examples of IBM AI work include:

  • In 2022, IBM Research, in collaboration with Red Hat, released Project Wisdom, which pairs an AI foundation model with generative AI to automatically generate code for developers on Red Hat Ansible through a natural-language interface; a minimal sketch of that workflow follows this list. Those scripts can automate cloud networks, for example, simplifying cloud management, Chopra said.
  • IBM Research has created a foundation model for IT operations and management that flags impending crashes and creates coding commands to head them off.
  • IBM is looking to use foundation models based on its CodeNet dataset of popular coding languages to automate and modernize business processes. With these models, legacy systems might be enhanced with the capability to use aspects of the modern web, and applications might update themselves with little need for human oversight, IBM stated.
  • IBM Research said this year that it would partner with NASA to build a domain-specific foundation model trained on Earth-science literature to help scientists use up-to-date mission data and derive insights from research that would otherwise be challenging to read and internalize.
  • Also this year, IBM released a generative-AI foundation model called MolFormer that can help inform the creation of entirely new molecules, streamlining the development of new materials, including drugs.
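
As a rough illustration of the Project Wisdom workflow noted above, the sketch below shows the shape of the interaction: a natural-language request goes in, an Ansible playbook comes out. The generate_playbook() helper is hypothetical; a real implementation would call IBM's model or another code-generation service rather than returning a canned playbook.

```python
# Minimal sketch of a natural-language-to-Ansible workflow, in the spirit of
# Project Wisdom. generate_playbook() is a placeholder: a real implementation
# would call a generative code model, not return a canned playbook.

def generate_playbook(task_description: str) -> str:
    """Return Ansible playbook YAML for the described task (stubbed)."""
    return """\
- name: {task}
  hosts: webservers
  become: true
  tasks:
    - name: Ensure nginx is installed
      ansible.builtin.package:
        name: nginx
        state: present
    - name: Ensure nginx is running
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true
""".format(task=task_description)


if __name__ == "__main__":
    print(generate_playbook("Install and start nginx on all web servers"))
```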

AI in IBM mainframes

IBM has integrated AI with its mainframes. The newest z16 Big Iron boasts an AI accelerator built onto its core Telum processor that can do 300 billion deep-learning inferences per day with one-millisecond latency, according to IBM. The latest version of its z/OS operating system will include a new AI Framework for system operations to optimize IT processes, simplify management, improve performance, and reduce skill requirements. The new version will also support technologies to deploy AI workloads co-located with z/OS applications and will feature improved cloud capabilities.
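
Taken at face value, that throughput figure works out to roughly 3.5 million inferences per second sustained; the one-millisecond number is per-request latency, so the accelerator has to serve many requests concurrently to reach it. The back-of-envelope arithmetic:

```python
# Back-of-envelope check on the z16 claim: 300 billion inferences per day
# implies the sustained per-second rate below. The 1 ms figure is per-request
# latency, so hitting this rate requires heavy concurrency.

INFERENCES_PER_DAY = 300_000_000_000
SECONDS_PER_DAY = 24 * 60 * 60

print(f"{INFERENCES_PER_DAY / SECONDS_PER_DAY:,.0f} inferences per second")  # ~3.5 million
```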

IBM said AI-powered workload management will intelligently predict upcoming workloads and react by allocating an appropriate number of batch runs, thus eliminating manual fine-tuning and trial-and-error approaches.
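
IBM has not detailed how that workload manager works internally, but the predict-then-allocate loop it describes can be sketched in miniature: forecast the next interval's batch demand from recent history, then size the pool of batch initiators to match. Everything in the sketch below (the moving-average forecast, the per-initiator capacity, the cap) is an illustrative assumption, not IBM's implementation.

```python
# Illustrative-only sketch of predict-then-allocate workload management.
# All constants and the forecasting method are assumptions for the example.

from statistics import mean

JOBS_PER_INITIATOR = 50   # assumed jobs one batch initiator handles per interval
MAX_INITIATORS = 40       # assumed system-wide cap

def forecast_next(history: list[int], window: int = 4) -> float:
    """Naive moving-average forecast of next interval's job arrivals."""
    return mean(history[-window:])

def initiators_needed(forecast: float) -> int:
    """Translate forecast demand into an initiator count, within the cap."""
    needed = -(-int(forecast) // JOBS_PER_INITIATOR)  # ceiling division
    return min(max(needed, 1), MAX_INITIATORS)

recent_arrivals = [820, 910, 1005, 1180, 1240, 1390]  # made-up job counts
predicted = forecast_next(recent_arrivals)
print(f"Predicted jobs next interval: {predicted:.0f}")   # ~1204
print(f"Batch initiators to allocate: {initiators_needed(predicted)}")  # 25
```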

“Systems are getting more and more complex, so we want to simplify operations with AI and automation by bringing a very prescriptive solution to our clients that will give them value out of the box and then much more,” Chopra said. “The ongoing work with the z/OS systems is just another example of how we will help clients deploy AI models into their core mission-critical workloads.”