Slide 1

AI DevOps Workshop: Using LLMs to Deploy Kubernetes Helm Charts
Chris Gruel
Senior Solution Architect, Akeyless

Slide 2

Agenda
01. The Challenge
02. Our Goal
03. The How
04. Tool Selection: LangChain, LangGraph & Chainlit
05. Integrating Human Insights
06. Developing Conversational Agents
07. Demo: Agentic Installer in Action
08. Lessons Learned
09. Future Directions
10. Q&A

Slide 3

The Challenge: Helm Chart Dependencies and the Akeyless Gateway
● What is the Gateway?
● Deployment requirements:
  ● Connectivity
  ● Identify the cloud provider
  ● Create the appropriate auth method
  ● Create the gateway!
● Currently a manual process
[Diagram: applications and platforms in the customer environment connecting to the Gateway]

Slide 4

Our Goal: Use Function-Calling in LLMs to Create Agentic Behavior
What is agentic behavior?
● Agent = an autonomous entity that can perform tasks on its own or with human supervision
● Agentic behavior = the agent does the work for us!
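The function-calling pattern behind this can be sketched in plain Python: the model emits a tool call, and our code dispatches it. The tool names here (`detect_cloud_provider`, `create_auth_method`) are hypothetical stand-ins, not the actual installer's tools.

```python
# Minimal sketch of LLM function-calling driving agentic behavior.
# Tool names and return values are illustrative only.

# Tools the agent may call: a description (what the LLM sees) plus
# the local function that actually runs.
TOOLS = {
    "detect_cloud_provider": {
        "description": "Return the cloud provider the cluster runs on.",
        "fn": lambda: "aws",
    },
    "create_auth_method": {
        "description": "Create an auth method for the given provider.",
        "fn": lambda provider: f"auth-method-for-{provider}",
    },
}

def dispatch(call: dict) -> str:
    """Execute one tool call emitted by the model, e.g.
    {"name": "create_auth_method", "arguments": {"provider": "aws"}}."""
    tool = TOOLS[call["name"]]
    return tool["fn"](**call.get("arguments", {}))

# Simulated model decision: detect the provider, then create the auth method.
provider = dispatch({"name": "detect_cloud_provider", "arguments": {}})
result = dispatch({"name": "create_auth_method",
                   "arguments": {"provider": provider}})
print(result)  # auth-method-for-aws
```

In the real workflow the `plan` comes from the LLM's function-call responses rather than being hard-coded.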

Slide 5

The How
1. Create an OpenAI agent that acts as an expert pair programmer
2. Use the agent to generate the overall code structure
3. Use the Cursor IDE to fill out the details

Slide 6

Tool Selection: LangChain and LangGraph
● LangChain: built to abstract LLM complexities and to connect to data sources
● LangChain benefits:
  ● Abstraction
  ● Flexibility
  ● Monitoring
● LangGraph: for cyclical workflows
● => Creates a human-in-the-loop chat interface
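The cyclical workflow LangGraph formalizes can be shown in plain Python so the control flow is visible: nodes that update shared state, and an edge that loops from the human back to the agent. The node names and state shape are illustrative, not LangGraph's API.

```python
# Sketch of a cyclical agent <-> human graph (the pattern LangGraph's
# StateGraph provides). Node names and state keys are hypothetical.

def agent_node(state):
    # Decide the next step from the conversation state.
    state["next"] = "ask_human" if state["answers_needed"] else "finish"
    return state

def ask_human_node(state):
    # In the real app a Chainlit prompt would block here; we simulate
    # the human answering one pending question.
    question = state["answers_needed"].pop(0)
    state["answers"][question] = f"answer to {question!r}"
    state["next"] = "agent"  # edge back to the agent: the cycle
    return state

NODES = {"agent": agent_node, "ask_human": ask_human_node}

def run_graph(state):
    node = "agent"
    while node != "finish":
        state = NODES[node](state)
        node = state["next"]
    return state

final = run_graph({"answers_needed": ["cloud provider?"], "answers": {}})
print(final["answers"])
```

LangGraph adds persistence, streaming, and conditional edges on top of this loop, which is why it was chosen over hand-rolling the cycle.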

Slide 7

Choosing Chainlit for the UI
● Streamlit: difficult to get full interactivity for chat
● Chainlit is built for this specific use case
● Lesson: it’s a fast-moving industry. If there’s a better solution, switch!

Slide 8

Integrating Human Insights
Give the agent the ability to ask questions, so it can:
1. Gather information
2. Handle exceptions
3. Give us aggregated data based on its questions and the answers it receives
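One way to implement this is to expose an ask-the-user tool to the agent and record every question/answer pair for later aggregation. The `ask_user` name and the canned answers are hypothetical, for illustration only.

```python
# Hypothetical ask_user tool: the agent calls it to gather missing facts,
# and the Q&A pairs are aggregated for later steps.

gathered = {}  # aggregated data: question -> answer

def ask_user(question: str, answer_fn) -> str:
    """Pose a question, record the answer, and return it to the agent."""
    answer = answer_fn(question)
    gathered[question] = answer
    return answer

# Simulated user; a Chainlit prompt would supply real input here.
canned = {"Which namespace?": "akeyless", "Expose via ingress?": "yes"}
for q in canned:
    ask_user(q, canned.get)

print(gathered)
```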

Slide 9

Developing Conversational Agents
1. Build asynchronous graphs with the list of tools at the LLM’s disposal
2. Purpose-built functions minimize complexity for the agent
3. The Akeyless SDK is easy to use, but we don’t want the agent to worry about the token
4. Minimize the number of inputs the agent must supply
5. We don’t want to send a token to OpenAI: the LLM should never hold actual credentials

Slide 10

Demo: The Agentic Installer in Action!
https://github.com/akeyless-community/heimdal

Slide 11

Lessons Learned
1. Embrace an AI-powered IDE
2. Build an OpenAI assistant, load it with the relevant knowledge, and use it to output pseudo-code as a starting point, to be filled out with the Cursor IDE. Benefit from the richer context information built into the IDE itself!
3. Don’t waste time on a dead end that isn’t suited to the purpose!

Slide 12

Future Directions
1. Deploy the K8s Secret Injector
2. Detect if the External Secrets Operator is already in the cluster >> install the integration
3. Detect if the Nginx ingress is installed in the cluster, and offer to deploy with Nginx ingress
4. Self-service POC: deploy gateway, database, set up secrets…
5. Incorporate ALL public Akeyless documentation and enable RAG for instant support help

Slide 13

Thank you.