Cloudera has added several new innovations for building and deploying Generative AI applications – everything from new project kickstarters called Accelerators for Machine Learning Projects (AMPs), to updates from NVIDIA that streamline large-scale AI model deployment and management, to real-time streaming pipelines that bring your AI game into the fast lane.

In this session, we will explore these three areas of innovation and why each of them is good news for your AI projects. You will learn how:

  • Our AMPs get you from concept to production up to 80% faster and connect you with our user community to ensure that you can access the latest and greatest ways to leverage AI in your enterprise.
  • Cloudera’s upcoming AI Inference service, powered by the NVIDIA NeMo Inference Microservice (NIM), streamlines the deployment and management of large-scale AI models, delivering up to 36x faster performance.
  • AI use cases like fraud prevention, supply chain optimization, personalized offers, and many more benefit from real-time data, and how we make it easy to augment your AI initiatives with real-time data pipelines.
