Technical Summit EN 2024: ‘Talk to your systems’ - Integrating Gen AI into your architectures with structured LLM output

Talking to your data (aka RAG) is the 'Hello World' use case for LLMs. But there is much more to explore! Based on their understanding of human language, LLMs can drive innovative user interactions for applications and systems. In this session, Christian demonstrates how to use structured data output with data schemas and function calling to interconnect your APIs with the power of LLMs. Discover how to unlock the potential of your solutions by harnessing the transformative nature of Generative AI. Join this session and let's talk to your systems!

Christian Weyer

October 15, 2024

Transcript

  1. ‘Talk to your systems’ – Integrating Gen AI into your architectures with structured LLM output. Christian Weyer | Co-Founder & CTO | Thinktecture AG | [email protected]
  2. Talk to your systems: Why?
  3. Human language rocks: extending access to software
  4. A classical UI: strong with certain use cases
  5. One possible UX pattern: language-enabled “UIs”
  6. Talk to your systems: How?
  7. Prompting: talk to me!
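A minimal sketch of the prompting step (illustrative only: the prompt wording, the field names, and the hard-coded reply are assumptions, not taken from the talk). The point is that instructing the model to answer with JSON only makes the reply machine-readable:

```python
import json

# Instruct the model to reply with JSON only, so the answer can be parsed
# programmatically instead of being free-form prose.
SYSTEM_PROMPT = (
    "You are an assistant for a booking system. "
    "Answer ONLY with a JSON object of the form "
    '{"expert": string, "available": boolean} and nothing else.'
)

def build_messages(user_text: str) -> list[dict]:
    """Build a chat-completion-style message list for the LLM call."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_text},
    ]

# A reply in the requested shape (hard-coded here instead of a real LLM call):
raw_reply = '{"expert": "CL", "available": true}'
parsed = json.loads(raw_reply)
print(parsed["expert"], parsed["available"])  # CL True
```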
  8. ‘Function’ calling: give it schema!
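A sketch of how function calling with a schema can look (the tool definition follows the OpenAI-style tools format; the check_availability function and its parameters are made up for illustration). The model is described a function via JSON Schema, returns the function name plus JSON arguments, and the application validates and dispatches the call locally:

```python
import json

# Describe a callable function to the model using a JSON Schema
# (OpenAI-style tools format).
TOOLS = [{
    "type": "function",
    "function": {
        "name": "check_availability",
        "description": "Check when an expert is available.",
        "parameters": {
            "type": "object",
            "properties": {
                "expert": {"type": "string"},
                "date": {"type": "string", "description": "ISO date"},
            },
            "required": ["expert"],
        },
    },
}]

def check_availability(expert: str, date: str = "today") -> str:
    # Stub for the real business API call.
    return f"{expert} is available on {date}"

DISPATCH = {"check_availability": check_availability}

def handle_tool_call(name: str, arguments_json: str) -> str:
    """Validate and execute a tool call returned by the model."""
    args = json.loads(arguments_json)  # the model sends arguments as JSON text
    if name not in DISPATCH:
        raise ValueError(f"Unknown tool: {name}")
    return DISPATCH[name](**args)

# Simulated model output (a real one would come back from the LLM API):
print(handle_tool_call("check_availability",
                       '{"expert": "CL", "date": "2024-10-16"}'))
# CL is available on 2024-10-16
```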
  9. Pydantic &amp; Instructor: make it easier!
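A sketch of what Pydantic contributes here (the model and field names are made up): structured LLM output is validated against a typed model instead of being trusted blindly. Instructor builds on exactly this idea by accepting a Pydantic model as `response_model` for the LLM call and retrying when validation fails:

```python
from pydantic import BaseModel, ValidationError

# Validate structured LLM output against a typed Pydantic model.
class ExpertAvailability(BaseModel):
    expert: str
    booking_times: list[str]

raw = '{"expert": "CL", "booking_times": ["2024-10-16T10:00", "2024-10-16T14:00"]}'
result = ExpertAvailability.model_validate_json(raw)
print(result.expert, len(result.booking_times))

# Malformed output is rejected instead of silently propagating:
try:
    ExpertAvailability.model_validate_json('{"expert": 42}')
except ValidationError:
    print("invalid structured output")
```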
  10. Talk to your systems: What?
  11. End-to-end: talking to your applications
  12. Talk to Thinktecture: end-to-end demo architecture. An Angular PWA
    captures the spoken question (“When is CL…?”), a Speech-to-Text service
    transcribes it, and an internal gateway (Python FastAPI) asks an LLM / SLM
    to extract { experts, booking times } from the transcribed text as
    structured JSON data (tool calling). The gateway checks expert availability
    against the internal business API (node.js – veeeery old), the LLM / SLM
    generates a response with the availability (“CL will be…”), and
    Text-to-Speech turns the response into audio for the user.
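The gateway flow from the demo can be sketched as a plain function pipeline (every external service is stubbed here; the speech-to-text, LLM, and business API behavior is invented for illustration, while the real demo uses FastAPI and live services):

```python
# Stubbed sketch of the gateway pipeline: audio in, spoken answer text out.

def speech_to_text(audio: bytes) -> str:
    return "When is CL available?"  # stub transcription service

def llm_extract(text: str) -> dict:
    # Stub for the LLM tool call extracting { experts, booking times }
    # as structured JSON data.
    return {"expert": "CL", "date": "2024-10-16"}

def query_business_api(expert: str, date: str) -> str:
    return f"{expert} will be available on {date}"  # stub node.js API

def handle_request(audio: bytes) -> str:
    text = speech_to_text(audio)
    extracted = llm_extract(text)          # structured JSON data
    return query_business_api(extracted["expert"], extracted["date"])

print(handle_request(b"..."))  # CL will be available on 2024-10-16
```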
  13. Smart form filling: filling Angular forms with human language input.

    protected readonly formGroup = this.fb.group({
      firstName: [''],
      lastName: [''],
      addressLine1: [''],
      addressLine2: [''],
      city: [''],
      state: [''],
      zip: [''],
      country: ['']
    });

    Example input to the Smart Form Filler (TS code &amp; LLM): “OK, nice – so
    here is my address then: Peter Schmitt, Rheinstr. 7 in Schkeuditz –
    postcode is 04435, BTW.”
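The extraction side of smart form filling can be sketched like this (the field names mirror the Angular FormGroup above; the extracted values are hard-coded as the kind of JSON an LLM would plausibly return for the example sentence, not a real LLM call):

```python
import json

# The LLM is given the form's field names as a schema and returns a JSON
# object containing only the fields it could extract from the free text.
FORM_FIELDS = ["firstName", "lastName", "addressLine1", "addressLine2",
               "city", "state", "zip", "country"]

llm_output = json.loads(
    '{"firstName": "Peter", "lastName": "Schmitt",'
    ' "addressLine1": "Rheinstr. 7", "city": "Schkeuditz", "zip": "04435"}'
)

def patch_form(form: dict, extracted: dict) -> dict:
    """Apply only known fields, keeping existing values for the rest
    (comparable to Angular's formGroup.patchValue())."""
    return {f: extracted.get(f, form.get(f, "")) for f in FORM_FIELDS}

form = {f: "" for f in FORM_FIELDS}
filled = patch_form(form, llm_output)
print(filled["city"], filled["zip"])  # Schkeuditz 04435
```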
  14. Talk to your systems: Recap
  15. Recap &amp; recommendations:

    § Human language enables powerful new use cases &amp; access to our software
    § Always use structured output: it is the secret sauce for integrating LLMs into your application architectures
    § Consider applying the Maybe pattern; it brings more robustness
    § Function calling can be flaky, especially with smaller models
    § Do not use frameworks that ‘auto-magically’ map function-calling results to local code; always validate return data!
    § Instructor is a helpful library to boost LLM use cases: it implements lots of best practices, supports any LLM / SLM, and integrates with FastAPI
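The Maybe pattern mentioned in the recap can be sketched as follows (the names are illustrative; Instructor ships a comparable Maybe helper for Pydantic models). Instead of forcing the LLM to always produce a value, the result type lets it report "could not extract" explicitly, which makes downstream handling more robust:

```python
from dataclasses import dataclass
from typing import Generic, Optional, TypeVar

T = TypeVar("T")

@dataclass
class Maybe(Generic[T]):
    """Holds either a successfully extracted result or an explicit error."""
    result: Optional[T] = None
    error: bool = False
    message: Optional[str] = None

def parse_expert(text: str) -> Maybe[str]:
    # Stub for an LLM extraction that may legitimately fail.
    if "CL" in text:
        return Maybe(result="CL")
    return Maybe(error=True, message="No expert mentioned")

ok = parse_expert("When is CL available?")
bad = parse_expert("What's the weather like?")
print(ok.result, bad.error)  # CL True
```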