Slide 1

Slide 1 text

‘Talk to your systems’ Structured Output & Tool Calling - the backbone for integrating LLMs into your own applications
Christian Weyer | Co-Founder & CTO | Thinktecture AG | [email protected]

Slide 2

Slide 2 text

'Talk to your systems' Integrating Gen AI into your architectures with structured LLM output

TALK TO YOUR SYSTEMS: WHY?

Slide 3

Slide 3 text

Human language rocks – extending access to software

Slide 4

Slide 4 text

A classical UI – strong with certain use cases

Slide 5

Slide 5 text

Language-enabled “UIs” – one possible UX pattern

Slide 6

Slide 6 text

LLMs – enabling new scenarios

Slide 7

Slide 7 text

End-to-end architectures with LLMs

§ LLMs are always part of end-to-end architectures
  § Client apps (Web, desktop, mobile)
  § Services with APIs
  § Databases
  § etc.
§ An LLM is an additional asset with an API in your architecture
  § Enabling human language as a first-class citizen

[Diagram: Clients (Desktop, Web, Mobile) → API Gateway → Services (Service A, Service B, Service C), Monitoring, LLMs (LLM 1, LLM 2)]

Slide 8

Slide 8 text

TALK TO YOUR SYSTEMS: HOW?

Slide 9

Slide 9 text

Prompting – talk to me!
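A minimal sketch of the prompting approach: ask the model for JSON in the prompt itself, then parse its reply defensively. The schema, the helper names, and the simulated reply are illustrative assumptions; the actual provider call is out of scope here.

```python
import json
import re

def build_extraction_prompt(user_text: str) -> str:
    """Build a prompt that instructs the model to answer with JSON only.

    The target schema here is a hypothetical example.
    """
    return (
        "Extract the person's name and city from the text below.\n"
        'Answer ONLY with JSON of the shape {"name": string, "city": string}.\n\n'
        f"Text: {user_text}"
    )

def parse_model_reply(reply: str) -> dict:
    """Parse the model reply, tolerating surrounding prose or code fences."""
    match = re.search(r"\{.*\}", reply, re.DOTALL)
    if not match:
        raise ValueError("no JSON object found in model reply")
    return json.loads(match.group(0))

# Simulated reply – models often wrap the JSON in extra chatter:
reply = 'Sure! Here you go:\n{"name": "Peter Schmitt", "city": "Schkeuditz"}'
data = parse_model_reply(reply)
```

The regex fallback shows why prompting alone is fragile: nothing forces the model to honor the requested shape, which is what the schema-based approaches on the next slides address.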

Slide 10

Slide 10 text

‘Function Calling’ – give it schema!

Slide 11

Slide 11 text

Pydantic & Instructor – make it easier!
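A sketch of the Pydantic + Instructor combination: you declare the target shape once as a Pydantic class and let Instructor handle schema generation, validation, and retries. The field names mirror the talk's demo; the commented Instructor call is indicative only, and the offline validation below uses plain Pydantic.

```python
from pydantic import BaseModel, Field

class ExpertRequest(BaseModel):
    """Target shape for the LLM's structured output (illustrative fields)."""
    experts: list[str] = Field(description="Names of the requested experts")
    booking_times: list[str] = Field(default_factory=list)

# With Instructor the class is passed as response_model, roughly:
#
#   client = instructor.from_openai(OpenAI())
#   req = client.chat.completions.create(
#       model="gpt-4o-mini", response_model=ExpertRequest,
#       messages=[{"role": "user", "content": "When is CL available?"}],
#   )
#
# Offline, a raw model reply validates against the same class:
raw = '{"experts": ["CL"], "booking_times": ["2024-06-03T10:00"]}'
req = ExpertRequest.model_validate_json(raw)
```

Because the same class drives both the schema sent to the model and the validation of its answer, the two can never drift apart.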

Slide 12

Slide 12 text

TALK TO YOUR SYSTEMS: WHAT?

Slide 13

Slide 13 text

End-to-end – talking to your applications

Slide 14

Slide 14 text

Talk to Thinktecture

Components: Angular PWA, Speech-to-Text, Internal Gateway (Python FastAPI), LLM / SLM, Internal Business API (node.js – veeeery old), Text-to-Speech

Flow:
1. 🗣 User asks “When is CL…?” – the Angular PWA sends the audio to Speech-to-Text, which transcribes the spoken text.
2. The transcribed text goes to the Internal Gateway, which asks the LLM / SLM to check for expert availability with the text.
3. The LLM / SLM extracts { experts, booking times } from the text as structured JSON data (tool calling).
4. The gateway queries the Availability API of the Internal Business API and receives the availability.
5. The LLM / SLM generates a response with the availability (“CL will be…”).
6. 🔉 The response is turned into audio via Text-to-Speech and played back to the user.
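The gateway pipeline on this slide can be sketched as a chain of small functions. Every function below is a hypothetical stub standing in for a real service (speech-to-text, the LLM / SLM extraction, the legacy node.js availability API); only the orchestration shape is the point.

```python
def speech_to_text(audio: bytes) -> str:
    # Stub for the transcription service.
    return "When is CL available next week?"

def extract_request(text: str) -> dict:
    # In the real gateway an LLM / SLM produces this via tool calling.
    return {"experts": ["CL"], "booking_times": ["next week"]}

def query_availability(expert: str) -> str:
    # Stand-in for the (very old) node.js business API.
    return f"{expert} will be available on Tuesday."

def generate_response(availability: str) -> str:
    # In the real flow the LLM phrases the final answer.
    return availability

def handle_voice_request(audio: bytes) -> str:
    """Gateway pipeline: transcribe -> extract -> query -> respond."""
    text = speech_to_text(audio)
    request = extract_request(text)
    availability = query_availability(request["experts"][0])
    return generate_response(availability)

answer = handle_voice_request(b"...")  # the answer then goes to text-to-speech
```

Keeping each hop a plain function makes the LLM just another service in the chain, which is exactly the architectural point of the demo.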

Slide 15

Slide 15 text

Smart form filling – filling Angular forms with human language input

protected readonly formGroup = this.fb.group({
  firstName: [''],
  lastName: [''],
  addressLine1: [''],
  addressLine2: [''],
  city: [''],
  state: [''],
  zip: [''],
  country: ['']
});

User: “OK, nice – so here is my address then: Peter Schmitt, Rheinstr. 7 in Schkeuditz – postcode is 04435, BTW.”

Smart Form Filler (TS code & LLM)
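The mapping step behind smart form filling can be sketched as follows: take the LLM's structured output, keep only keys that exist in the form, and drop everything else so hallucinated fields never reach the UI. The field names mirror the Angular formGroup on this slide; the helper name and the sample LLM output are assumptions.

```python
import json

# Field names mirror the Angular formGroup on this slide.
FORM_FIELDS = ["firstName", "lastName", "addressLine1", "addressLine2",
               "city", "state", "zip", "country"]

def build_form_patch(extracted_json: str) -> dict:
    """Turn the LLM's structured output into a patch payload for the form,
    keeping only known, non-empty fields."""
    extracted = json.loads(extracted_json)
    return {k: v for k, v in extracted.items() if k in FORM_FIELDS and v}

# Hypothetical LLM output for the spoken address on this slide:
llm_output = json.dumps({
    "firstName": "Peter", "lastName": "Schmitt",
    "addressLine1": "Rheinstr. 7", "city": "Schkeuditz",
    "zip": "04435", "mood": "friendly",  # extra key the model invented
})
patch = build_form_patch(llm_output)     # 'mood' is filtered out
```

In the Angular demo the equivalent payload would be handed to `formGroup.patchValue(...)`, which ignores unknown keys anyway; filtering first just makes the contract explicit.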

Slide 16

Slide 16 text

TALK TO YOUR SYSTEMS: RECAP

Slide 17

Slide 17 text

Recap & Recommendations

§ Human language enables powerful new use cases & access to our software
§ Always use structured output
  § Structured output is the secret sauce for integrating LLMs into your application architectures
§ Consider applying the Maybe pattern
  § Brings more robustness
§ Function Calling can be flaky
  § Especially with smaller models
  § Do not use frameworks that ‘auto-magically’ map Function Calling results to local code
  § Always validate return data!
§ Instructor is a helpful library to boost LLM use cases
  § Implements lots of best practices
  § Supports any LLM / SLM
  § Integrates with FastAPI
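The Maybe pattern recommended above can be sketched with a generic Pydantic wrapper: the model is allowed to report failure explicitly instead of hallucinating a result. This is a hand-rolled sketch of the pattern, not Instructor's own API; the class and field names are illustrative.

```python
from typing import Generic, Optional, TypeVar
from pydantic import BaseModel

T = TypeVar("T")

class ExpertRequest(BaseModel):
    experts: list[str]

class Maybe(BaseModel, Generic[T]):
    """Maybe pattern: either a validated result, or an explicit error."""
    result: Optional[T] = None
    error: bool = False
    message: Optional[str] = None

# Used as the response model (e.g. Maybe[ExpertRequest]); offline examples:
ok = Maybe[ExpertRequest].model_validate(
    {"result": {"experts": ["CL"]}, "error": False})
failed = Maybe[ExpertRequest].model_validate(
    {"error": True, "message": "No expert mentioned in the input."})
```

Giving the model a sanctioned way to say "I could not extract this" is what makes the pattern more robust than forcing a result on every call.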

Slide 18

Slide 18 text

Thank you!
Christian Weyer
https://thinktecture.com/christian-weyer
https://github.com/thinktecture-labs/talk-to-your-systems
https://github.com/thinktecture-labs/smart-form-filler