Red Hat AI: What’s New and What’s Next – Red Hat Boosts Enterprise AI Across the Hybrid Cloud
AI portfolio adds enhancements to Red Hat OpenShift AI and Red Hat Enterprise Linux AI to help operationalize AI strategies.
Red Hat offers a portfolio of AI products and services under the Red Hat AI umbrella, designed to build, deploy, and manage AI and machine learning (ML) solutions across hybrid cloud environments.
These products emphasize open source principles, flexibility, and scalability, enabling enterprises to customize and deploy AI models efficiently.
In March, Red Hat announced the latest updates to Red Hat AI, providing an enterprise AI platform for model training and inference that delivers increased efficiency, a simplified experience, and the flexibility to deploy anywhere across a hybrid cloud environment.
“Red Hat knows that enterprises will need ways to manage the rising cost of their generative AI deployments, as they bring more use cases to production and run at scale.
They also need to address the challenge of integrating AI models with private enterprise data and be able to deploy these models wherever their data may live. Red Hat AI helps enterprises address these challenges by enabling them to leverage more efficient, purpose-built models, trained on their data and enable flexible inference across on-premises, cloud and edge environments.”
Joe Fernandes, VP, AI Business Unit, Red Hat
Even as businesses look for ways to reduce the cost of deploying large language models (LLMs) at scale across a growing number of use cases, they still face the challenge of integrating those models with the proprietary data that drives those use cases, and of reaching that data wherever it lives, whether in a data center, across public clouds, or at the edge.
Encompassing both Red Hat OpenShift AI and Red Hat Enterprise Linux AI (RHEL AI), Red Hat AI addresses these concerns with an enterprise AI platform that lets users adopt more efficient, optimized models tuned on business-specific data, which can then be deployed across the hybrid cloud for both training and inference on a wide range of accelerated compute architectures.
Red Hat OpenShift AI
Red Hat OpenShift AI is a comprehensive platform for building, deploying, and managing predictive and generative AI models at scale across hybrid cloud environments (on-premises, public cloud, or edge).
It supports the entire AI/ML lifecycle, from experimentation to production, and is built on open source technologies with Kubernetes at its core.
The platform provides the functionality to build predictive models and tune generative AI models, along with tools that simplify AI model management, from data science and model pipelines to model monitoring, governance, and more.
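As a concrete illustration, models deployed through OpenShift AI's model serving are typically reachable over a standard REST endpoint. The minimal sketch below shows how an application might query a generative model hosted on the platform; the route URL, token, and model name are hypothetical placeholders, and it assumes the serving runtime exposes an OpenAI-compatible API (as the vLLM-based runtime does).

```python
# Minimal sketch: querying a generative model served from OpenShift AI.
# The endpoint URL, API key, and model name are hypothetical placeholders;
# the standard "openai" client is pointed at an OpenAI-compatible serving runtime.
from openai import OpenAI

client = OpenAI(
    base_url="https://my-model.apps.example.com/v1",  # hypothetical route to the model server
    api_key="YOUR_SERVICE_ACCOUNT_TOKEN",             # hypothetical auth token
)

response = client.chat.completions.create(
    model="granite-3-8b-instruct",  # hypothetical model name configured in the serving runtime
    messages=[{"role": "user", "content": "Summarize our incident-response runbook in three bullet points."}],
    max_tokens=256,
)
print(response.choices[0].message.content)
```

Because the endpoint follows a widely used API convention, the same client code can target a model running on premises, in a public cloud, or at the edge simply by changing the base URL.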
RHEL AI
Last year, Red Hat President and CEO Matt Hicks and Vice President and General Manager Joe Fernandes announced Red Hat Enterprise Linux AI, a groundbreaking collaboration between Red Hat and IBM.
RHEL AI makes generative AI more accessible through a foundation model platform driven by open source communities. It brings together IBM's open source Granite language models, the InstructLab community and its open source tooling, and an AI-optimized, bootable distribution of Red Hat Enterprise Linux that can be deployed across any environment.
With RHEL AI, enterprises can easily develop, test, and deploy Granite-based large language models on their own infrastructure. Its enterprise-grade capabilities are tailored to lower the barrier to entry and help businesses get their AI projects into production smoothly.
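To give a flavor of that local workflow, the sketch below queries a Granite model that has already been served on the same machine with InstructLab's local model server; the port and model name are assumptions for illustration, based on the server exposing an OpenAI-compatible HTTP API.

```python
# Minimal sketch, assuming a Granite model is already being served locally
# (for example via InstructLab's local model server) on an OpenAI-compatible API.
# The endpoint, port, and model name below are assumptions for illustration.
import requests

resp = requests.post(
    "http://127.0.0.1:8000/v1/chat/completions",  # assumed local endpoint
    json={
        "model": "granite-7b-lab",  # hypothetical locally downloaded Granite model
        "messages": [
            {"role": "user", "content": "Explain what the InstructLab taxonomy is used for."}
        ],
        "max_tokens": 200,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

Everything in this sketch runs against local infrastructure, which reflects the RHEL AI approach of keeping model development and inference wherever the enterprise's data lives.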
Summary
Red Hat’s AI portfolio includes Red Hat OpenShift AI for comprehensive AI/ML lifecycle management, Red Hat Enterprise Linux AI for accessible LLM development, and Red Hat Lightspeed for AI-driven automation across hybrid cloud platforms. These products are designed to be flexible, scalable, and open source, helping enterprises in industries such as finance, healthcare, and education innovate with AI while maintaining control over their data and deployments.