As presented by Laura Jordana, Nutanix Enterprise AI (NAI) is designed to simplify how IT administrators and developers deploy and manage AI models. The presentation opens with a demonstration of the NAI interface; NAI itself is a Kubernetes application deployable on a variety of platforms. The primary use case highlighted is enabling IT admins to give developers easy access to LLMs by connecting to external model repositories and creating secure endpoints. This lets developers build and deploy AI workflows while keeping data within the organization's control.
The demo showcases the dashboard, which offers insights into active endpoints, request metrics, and infrastructure health, a view IT admins rely on to monitor model usage and its impact on resources. The workflow involves importing models from hubs such as Hugging Face and creating endpoints that expose them through the underlying inference engine. The presenter emphasizes the simplicity of this process, with much of the configuration pre-filled to reduce the admin workload, and highlights the platform's OpenAI API compatibility, which allows integration with existing tools.
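Because the endpoints are OpenAI-compatible, a developer can point any standard OpenAI client at them. The minimal sketch below illustrates the idea in Python; the base URL, API key, and model name are hypothetical placeholders rather than values shown in the presentation.

# Minimal sketch: calling an NAI inference endpoint through its OpenAI-compatible API.
# The endpoint URL, API key, and model name are hypothetical placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://nai.example.internal/api/v1",  # hypothetical NAI endpoint URL
    api_key="YOUR_NAI_API_KEY",                      # key issued by the IT admin
)

response = client.chat.completions.create(
    model="llama-3-8b-instruct",  # example of a model imported from Hugging Face
    messages=[{"role": "user", "content": "Summarize our deployment runbook."}],
)
print(response.choices[0].message.content)

The point of the compatibility layer is that existing tools and SDKs built against the OpenAI API can be redirected to an in-house endpoint simply by changing the base URL and credentials.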
The platform focuses on inferencing rather than model training, providing a secure, streamlined way to deploy and manage models within the organization's infrastructure. The key takeaway from the presentation is the simplification of AI model deployment, with an emphasis on day 2 operations and ease of use. Because NAI runs on Kubernetes, it can be deployed on Nutanix infrastructure, EKS, and other cloud Kubernetes services. It also gives IT admins API access and monitoring capabilities, and gives AI developers easy access to LLMs.
Presented by Laura Jordana, Director, Technical Marketing, Nutanix. Recorded live in Santa Clara, California, on April 24, 2025, as part of AI Infrastructure Field Day. Watch the entire presentation at https://techfieldday.com/appearance/nutanix-presents-at-ai-infrastructure-field-day-2/ or https://techfieldday.com/event/aiifd2/ for more information.