AI-driven infrastructure: Boosting efficiency and innovation in DevOps

The push to accomplish more with less is stronger than ever in today’s enterprise world. This challenge is driving the adoption of AI-driven infrastructure.

Many organizations are not equipped to deploy large language models on their own, leading to the rise of turnkey solutions such as GPT-in-a-Box. These innovations enable the integration of LLMs, generative AI applications and machine learning operations across various environments, transforming how businesses utilize technology, according to Debojyoti (Debo) Dutta (pictured, left), vice president of engineering (AI) at Nutanix Inc.

Nutanix’s Debojyoti Dutta talks to theCUBE about AI-driven infrastructure.

“When people think of Nutanix, they think of us as private cloud architecture … we are morphing from hyperconverged to enabling basically cloud-native applications, including AI and generative AI,” Dutta said. “Everybody wants to run their customer support bot automatically. They want to use AI to run faster, whether it’s code generation, whether it’s productivity, whether it’s SRE, DevOps. That’s where we come in and we have a thing called GPT-in-a-Box, a solution that helps customers run generative AI applications really quickly and efficiently with governance and data protection.”

Dutta, Shaked Askayo (center), co-founder and chief technology officer of Kubiya Inc., and Amit Eyal Govrin (right), founder and chief executive officer of Kubiya Inc., spoke with theCUBE Research’s Savannah Peterson and Rob Strechay at the Supercloud 7: Get Ready for the Next Data Platform event, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed the need for AI-driven infrastructure in the present enterprise world.

Integrating AI-driven infrastructure into DevOps

Combining the worlds of development and operations remains a tall order, which is why DevOps continues to be a headache. AI-driven infrastructure is proving to be an effective remedy, taming that complexity with LLMs and ushering in a new DevOps era, according to Askayo.

“It’s still hard to adopt CI/CD and also infrastructure as code,” he stated. “We’re now just entering another phase in this revolution. This phase is enabled by the fact that large language models allow us to take off the abstraction, the complexity layer of knowing how to get to these automations, understanding these CI/CD processes or these kinds of collaborative processes. Something that is missing for anyone in CI/CD could be as simple as notifications.”

Since AI-driven infrastructure can abstract away that complexity, LLM integration is essential. That makes it important to dig into how the LLMs will run and how they will fit into the existing framework, Askayo pointed out.

“When trying to adopt LLM-based processes, which are essentially by their nature DevOps, there are a lot of ways to get such processes up and running,” he said. “How do I deal with secrets? How do I deal with serverless orchestration? All of these complexities are something that we learned right from the beginning, even before trying to solve the problem of LLM joining the kitchen. We allow you to essentially define your LLM complex processes, DevOps processes as code concepts.”
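To picture the “processes as code” idea Askayo describes, here is a minimal, hypothetical sketch in Python. It is not Kubiya’s actual API; the class and field names are illustrative assumptions about how a DevOps task might be declared as code, with its secrets and notification targets spelled out so an LLM-driven agent could run it on a user’s behalf.

```python
# Hypothetical sketch of a "DevOps process as code" declaration.
# These classes and fields are illustrative, not Kubiya's product API.
from dataclasses import dataclass, field

@dataclass
class Workflow:
    name: str                      # human-readable task name an LLM agent can match a request against
    steps: list[str]               # commands or pipeline stages to run in order
    secrets: list[str] = field(default_factory=list)   # secret names resolved at runtime, never hardcoded
    notify: list[str] = field(default_factory=list)    # channels to notify on completion or failure

# A deployment task declared as code: the requester can ask for it in plain
# language ("deploy the staging service") without knowing the CI/CD details.
deploy_staging = Workflow(
    name="deploy-staging",
    steps=[
        "docker build -t app:staging .",
        "kubectl rollout restart deploy/app -n staging",
    ],
    secrets=["KUBE_CONFIG"],
    notify=["#devops-alerts"],
)
```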

To enhance developer productivity, delegation has become the new automation: It helps tackle mundane tasks, and at the end of the day it comes down to output versus input, according to Govrin.

“It’s no longer simply enough to have automation as a kind of a steady state,” he noted. “There’s a paradox called the time to automation paradox. Now, that loosely states the time and effort it takes you to configure a file and so forth. The outcome you’re trying to achieve is to take this role or take this task off my plate, put it in somebody else’s shoes. That’s the developer experience we’re looking for … where the form factor changes from automation to delegation.”

Enhancing AI safety and compliance

Since AI models are being used in all walks of life, including sensitive areas such as healthcare, safety is fundamental. That is where MLCommons fits in, with an open-collaboration philosophy that accelerates AI innovation in a secure manner, Dutta pointed out.

“MLCommons just announced their big AI safety initiative,” he said. “That’s one of the biggest milestones, a shift that’s happening. MLCommons started off as an infrastructure, benchmarking organization. Then it evolved to the best practices and building new data sets. Now it’s evolving again to being the trusted partner for AI safety. You don’t want models to kind of violate any regulations like NIST in the U.S., and we have similar ones from the EU, Singapore and the U.K.”

As the skills gap continues to bite the AI field, Kubiya and Nutanix are tackling this pain point with AI-driven infrastructure. The strategy is two-pronged: Enterprises get simplified infrastructure, while the developer ecosystem benefits from open-source solutions, according to Dutta.

“We are doing this in two different ways,” he explained. “One is we want to first help our customers run models very efficiently because it’s part of the MLOps lifecycle for them. Our GPT-in-a-Box will help you run large language models with connectors to Hugging Face Hub and Nvidia NIM, so that customers can then run the model of their choice.”
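As a point of reference for what “run the model of their choice” from Hugging Face Hub looks like in practice, here is a minimal, generic sketch using the open-source transformers library. It is illustrative only and does not reflect GPT-in-a-Box or Nvidia NIM interfaces; the model name is a placeholder.

```python
# Minimal sketch: pulling a model from Hugging Face Hub and running it locally.
# Illustrative only -- this uses the open-source transformers library,
# not Nutanix's GPT-in-a-Box or Nvidia NIM interfaces.
from transformers import pipeline

# "gpt2" is a placeholder; in practice a customer would pick the model of their choice.
generator = pipeline("text-generation", model="gpt2")

prompt = "Summarize last night's failed CI pipeline run:"
result = generator(prompt, max_new_tokens=60)
print(result[0]["generated_text"])
```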

For AI-driven infrastructure to be effective, holistic thinking is crucial. This entails a comprehensive view of how organizations deploy AI and data across multiple clouds, making data governance important, Dutta stated.

“I think as enterprises want to use AI to make them more efficient, they need to think holistically about their AI strategy, that MLOps and DevOps strategy,” he explained. “I mean, gluing this together is basically data. People have to be very careful about data governance and their model governance; it’s a two-prong thing.”

Stay tuned for the complete video interview, part of SiliconANGLE’s and theCUBE Research’s coverage of the Supercloud 7: Get Ready for the Next Data Platform event.

Photo: SiliconANGLE

Article Source
https://siliconangle.com/2024/08/14/ai-driven-infrastructure-supercloud7/