Slide 1

Slide 1 text

AG-UI & MCP: Standardizing Agentic Web Interfaces • c't • Max Marschall @MaxOSchulte • Consultant / Architect @ Thinktecture

Slide 2

Slide 2 text

In this Session • Model Context Protocol (MCP) • The Agent–User Interaction (AG-UI) Protocol

Slide 3

Slide 3 text

The Model Context Protocol Think of MCP like a USB-C port for AI applications. [...] MCP provides a standardized way to connect AI applications to external systems. https://modelcontextprotocol.io/docs/getting-started/intro

Slide 4

Slide 4 text

The Model Context Protocol
HOST • LLM applications (Claude Desktop, IDEs) that initiate connections
CLIENT • 1:1 connection handler inside the host, manages protocol communication
SERVER • Provides context, tools, and capabilities to clients • Servers can't talk to each other
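The host/client/server split can be made concrete with a small sketch using the official MCP Python SDK ("mcp" package): a host process owns one ClientSession per server. The server script name is a placeholder, not from the slides.

# Minimal sketch: a host acting as MCP client, connected 1:1 to one stdio server.
# Assumes the official MCP Python SDK; "example_server.py" is hypothetical.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(command="python", args=["example_server.py"])

async def main() -> None:
    # The client manages exactly one connection to one server (1:1).
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()              # protocol handshake
            tools = await session.list_tools()      # capabilities the server provides
            print([tool.name for tool in tools.tools])

asyncio.run(main())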

Slide 5

Slide 5 text

Why should we care? • Do not repeat yourself (KISS) • No standardization across implementations • Fragile when systems change

Slide 6

Slide 6 text

M × N ➡ M + N: without MCP, each of the M AI applications (App 1, App 2, App 3) integrates directly with each of the N backends (Database, Documentation, FAQ); with MCP in between, M AI applications connect to N tools, resources, and prompts through one protocol.
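To make the arithmetic concrete (illustrative numbers, not from the slide): 3 applications wired directly to 3 backends need 3 × 3 = 9 bespoke integrations, while with MCP you need 3 clients plus 3 servers, i.e. 3 + 3 = 6, and the gap widens as M and N grow.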

Slide 7

Slide 7 text

• Build once, connect everywhere • Open-source and vendor-neutral • Based on proven patterns The MCP Idea 💡

Slide 8

Slide 8 text

The Model Context Protocol

Slide 9

Slide 9 text

The Model Context Protocol

Slide 10

Slide 10 text

The Model Context Protocol

Slide 11

Slide 11 text

Features • Tools • Resources • Prompts • Elicitation • Sampling • Server Composition • Progress Reporting — more to come (e.g. MCP-UI)
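A minimal sketch of the first three features (Tools, Resources, Prompts) using the FastMCP helper from the official Python SDK; the tool, resource URI, and prompt names are illustrative.

# Minimal MCP server exposing one tool, one resource, and one prompt.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

@mcp.resource("greeting://{name}")
def greeting(name: str) -> str:
    """A templated resource the client can read for context."""
    return f"Hello, {name}!"

@mcp.prompt()
def review_code(code: str) -> str:
    """A reusable prompt template."""
    return f"Please review this code:\n\n{code}"

if __name__ == "__main__":
    mcp.run()   # stdio transport by default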

Slide 12

Slide 12 text

Grounding ... is the ability to connect a large language model (LLM) to reliable sources of information to ensure accurate and relevant results. • Improves relevance • Reduces hallucinations • Connects to the real world • Alternatives: Retrieval-Augmented Generation (RAG) & Fine-Tuning

Slide 13

Slide 13 text

Context-Hydration ... provides an automated way to add concise contextual information based on changing requirements. Examples: • Web search / web access • Documentation access

Slide 14

Slide 14 text

Feature Support • Clients from the MCP proposer • Missing features • https://modelcontextprotocol.info/docs/clients/

Slide 15

Slide 15 text

Best Practices • Limit your tools • Scope your tools • Tool compatibility • Feature compatibility

Slide 16

Slide 16 text

Context Size – Solution: Dynamic Tools • Different tools by authentication / authorization • Update tool lists depending on preceding queries • Agent Skills enable / disable MCPs • Tool change (command)
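A conceptual sketch of the first point, "different tools by authentication / authorization": register tools conditionally so the model's context only carries tools the current user may actually call. It uses FastMCP from the official SDK; the role logic and tool bodies are illustrative, not from the talk.

# Dynamic tool exposure per role (illustrative).
from mcp.server.fastmcp import FastMCP

def build_server(role: str) -> FastMCP:
    mcp = FastMCP("crm")

    @mcp.tool()
    def read_customer(customer_id: str) -> str:
        """Read-only lookup, exposed to every role."""
        return f"customer {customer_id}: (demo record)"

    if role == "admin":
        @mcp.tool()
        def delete_customer(customer_id: str) -> str:
            """Destructive tool, only registered when the caller is an admin."""
            return f"customer {customer_id} deleted (demo)"

    return mcp

if __name__ == "__main__":
    build_server(role="support").run()   # support users never see delete_customer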

Slide 17

Slide 17 text

Outlook • Async Tools • MCP-UI • Code execution • (Anthropic / Claude) Agent Skills + MCP = RAG-MCP: not "more tools," but a better relationship to tools

Slide 18

Slide 18 text

In this Session • Model Context Protocol (MCP) • The Agent–User Interaction (AG-UI) Protocol

Slide 19

Slide 19 text

The Agent–User Interaction (AG-UI) Protocol AG-UI is an open, lightweight, event-based protocol that standardizes how AI agents connect to user-facing applications. https://docs.ag-ui.com/introduction

Slide 20

Slide 20 text

The Full Protocol Stack

Slide 21

Slide 21 text

Missing Link • MCP connects agents to tools & context • A2A enables agent-to-agent communication • No standard for agent-to-user interaction • Each framework invented custom protocols

Slide 22

Slide 22 text

Standardisation: Without AG-UI ➡ With AG-UI
• Framework fragmentation: every UI must code adapters for LangGraph, CrewAI, Mastra, etc. (M×N problem) ➡ single protocol, any framework works with any UI (M+N)
• State synchronization: manual polling or complex WebSocket logic ➡ JSON Patch state deltas, standardized sync
• Human-in-the-loop: custom approval workflows per app ➡ standardized HITL events built into the protocol
• Debugging: server-side logs only, UI–agent communication is a black box ➡ full client-side event tracing, transparent execution
• Generative UI: no standard for agents to specify UI components ➡ built-in component rendering events

Slide 23

Slide 23 text

Ecosystem Pydantic AI

Slide 24

Slide 24 text

AG-UI Server • Major framework support • "Simple" one-line expose • Fine-grained control possible • https://ai.pydantic.dev/ui/ag-ui/#examples • https://docs.copilotkit.ai/pydantic-ai/quickstart/pydantic-ai?path=exiting-agent
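A minimal sketch of the "one-line expose" following the linked Pydantic AI docs: Agent.to_ag_ui() wraps an agent as an ASGI app that speaks the AG-UI protocol. Model name and instructions are placeholders, and it assumes pydantic-ai installed with AG-UI support plus an OpenAI API key.

# Expose a Pydantic AI agent as an AG-UI endpoint (sketch per the linked docs).
from pydantic_ai import Agent

agent = Agent("openai:gpt-4o", instructions="Be concise.")
app = agent.to_ag_ui()   # ASGI app; run with: uvicorn my_agent:app --port 8000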

Slide 25

Slide 25 text

AG-UI Client: CopilotKit Framework • Manages communication • Helps with state • Human in the loop (HITL) • Ready-to-use React components • https://docs.copilotkit.ai/pydantic-ai/quickstart/pydantic-ai?path=exiting-agent

Slide 26

Slide 26 text

Server Communication Events ➡ ∑ 16 standard events (https://docs.ag-ui.com/concepts/events)
• Lifecycle Events: monitor the progression of agent runs
• Text Message Events: handle streaming textual content
• Tool Call Events: manage tool executions by agents
• State Management Events: synchronize state between agents and UI
• Activity Events: represent ongoing activity progress
• Special Events: support custom functionality
• Draft Events: proposed events under development

Slide 27

Slide 27 text

Communication Transport • SSE (Server-Sent Events): default, text-based, HTTP • Binary Protocol: high performance, custom serialization • WebSocket: bidirectional, low latency • HTTP Polling: fallback for restrictive networks
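A sketch of the default SSE transport: an endpoint that streams AG-UI-style events as Server-Sent Events over plain HTTP. FastAPI, the endpoint path, and the hard-coded event list are illustrative assumptions; a real server would stream events from a running agent instead.

# SSE transport sketch: one "data: <json>" frame per AG-UI event.
import json
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

def agent_run_events():
    """Yield a minimal run as SSE frames."""
    events = [
        {"type": "RUN_STARTED", "threadId": "t1", "runId": "r1"},
        {"type": "TEXT_MESSAGE_START", "messageId": "m1", "role": "assistant"},
        {"type": "TEXT_MESSAGE_CONTENT", "messageId": "m1", "delta": "Hello"},
        {"type": "TEXT_MESSAGE_END", "messageId": "m1"},
        {"type": "RUN_FINISHED", "threadId": "t1", "runId": "r1"},
    ]
    for event in events:
        yield f"data: {json.dumps(event)}\n\n"

@app.post("/agent")
def run_agent():
    return StreamingResponse(agent_run_events(), media_type="text/event-stream")

# run with: uvicorn sse_transport:app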

Slide 28

Slide 28 text

Text / Tool Streaming Event Pattern
Text: 1. TEXT_MESSAGE_START { messageId, role: "assistant" } 2. TEXT_MESSAGE_CONTENT (×N) { messageId, delta: "Hello" } { messageId, delta: " world" } 3. TEXT_MESSAGE_END { messageId }
Tool: 1. TOOL_CALL_START { toolCallId: "call_123", toolName: "search_database" } 2. TOOL_CALL_ARGS { toolCallId: "call_123", args: { query: "users" } } 3. TOOL_CALL_END { toolCallId: "call_123", result: [...] }
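On the receiving side, a client assembles this pattern by concatenating TEXT_MESSAGE_CONTENT deltas per messageId between START and END. The helper below is an illustrative sketch, not the AG-UI client SDK.

# Assemble streamed text messages from the event pattern above (illustrative).
def assemble_messages(events):
    messages: dict[str, str] = {}
    for event in events:
        if event["type"] == "TEXT_MESSAGE_START":
            messages[event["messageId"]] = ""
        elif event["type"] == "TEXT_MESSAGE_CONTENT":
            messages[event["messageId"]] += event["delta"]
        # TEXT_MESSAGE_END marks the message as complete; nothing to append.
    return messages

stream = [
    {"type": "TEXT_MESSAGE_START", "messageId": "m1", "role": "assistant"},
    {"type": "TEXT_MESSAGE_CONTENT", "messageId": "m1", "delta": "Hello"},
    {"type": "TEXT_MESSAGE_CONTENT", "messageId": "m1", "delta": " world"},
    {"type": "TEXT_MESSAGE_END", "messageId": "m1"},
]
print(assemble_messages(stream))   # {'m1': 'Hello world'}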

Slide 29

Slide 29 text

State • Bi-directional state management • STATE_SNAPSHOT: complete state object for initialization or refresh • STATE_DELTA: JSON Patch (RFC 6902) for efficient incremental updates
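A sketch of the snapshot-plus-delta idea: apply an RFC 6902 patch (the STATE_DELTA payload) on top of the last full state (STATE_SNAPSHOT). The "jsonpatch" PyPI package and the example state shape are assumptions for illustration.

# Apply a STATE_DELTA (RFC 6902 JSON Patch) to a STATE_SNAPSHOT.
import jsonpatch

snapshot = {"todos": [{"title": "Write slides", "done": False}], "user": "max"}

# A delta carries only the changed paths, not the whole state.
delta = [
    {"op": "replace", "path": "/todos/0/done", "value": True},
    {"op": "add", "path": "/todos/-", "value": {"title": "Give talk", "done": False}},
]

new_state = jsonpatch.apply_patch(snapshot, delta)
print(new_state["todos"])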

Slide 30

Slide 30 text

UI Capabilities
Static • Agent specifies the exact component and props to render • E.g. a component is mapped to a specific event in the UI
Declarative • Agent describes intent, the framework selects components • Agent knows about components and props

Slide 31

Slide 31 text

Non-Use Cases • Simple request/response with no streaming • Batch processing without user interaction • Static content delivery • Ultra-low latency requirements (<10 ms)

Slide 32

Slide 32 text

AG-UI • Keep humans in the loop • Visual feedback • Actual questions and commits • Increase acceptance through observability • Loading-placeholder effect

Slide 33

Slide 33 text

AG-UI Resources • Documentation: docs.ag-ui.com • GitHub: github.com/ag-ui-protocol/ag-ui • AG-UI Dojo: https://dojo.ag-ui.com/pydantic-ai/

Slide 34

Slide 34 text

It's a wrap! • Slides: https://thinktecture.com/max-marschall • Contact: @MaxOSchulte, [email protected]