


Angular-Apps smarter machen mit Gen AI: Lokal und offlinefähig - Hands-on Workshop!

Generative AI has long since arrived in everyday life, and now in Angular, too! In this workshop you will learn how to integrate AI functionality directly into your applications with Transformers.js and the Prompt API. You will build a chatbot and forms that fill themselves in thanks to AI. Local, offline-capable, and hands-on: this is how you bring Generative AI into your frontend in a practical way.

Generative AI is shaping the future of software development. AI-powered features are finding their way into more and more applications, from Office to Windows. In this workshop you will learn how to integrate the same technologies directly into your Angular projects.
With Transformers.js and the Prompt API, you bring powerful language models locally into the browser, without a server connection or cloud dependency. Together we will extend an existing TODO application with a chatbot and intelligent forms that fill themselves in with the help of AI. Along the way you will learn how to design prompts, integrate models, and validate results.
Alongside the theoretical background, the focus is on practical implementation: you will actively code along and take directly usable code home with you.


Christian Liebel

March 02, 2026

Transcript

  1. Hello, it’s me. Angular-Apps smarter machen mit Generative AI: lokal und offlinefähig. Hands-on Workshop!
     Christian Liebel. X: @christianliebel, Bluesky: @christianliebel.com, Email: christian.liebel@thinktecture.com.
     Angular, PWA & Generative AI. Slides: thinktecture.com/christian-liebel
  2. Timetable:
     09:00–10:30 Block 1
     10:30–11:00 Coffee Break
     11:00–12:30 Block 2
     12:30–13:30 Lunch Break
     13:30–15:00 Block 3
     15:00–15:30 Coffee Break
     15:30–17:00 Block 4
  3. Expectations. What to expect: focus on web app development; focus on Generative AI; up-to-date insights (the ML/AI field is evolving fast); live demos on real hardware; 17 hands-on labs. What not to expect: a deep dive into AI specifics, RAG, model fine-tuning or training; stable libraries or specifications. Huge downloads! High requirements! Things may break!
  4. Demo Use Case (Workshop Edition). DEMO
  5. Setup. Setup complete? (Node.js, Google Chrome, editor, Git, macOS/Windows, 20 GB free disk space, 6 GB VRAM)
  6. Setup (LAB #0):
     git clone https://github.com/thinktecture/basta-spring-2026-genai.git
     cd basta-spring-2026-genai
     npm i
  7. Generative AI everywhere. Source: https://www.apple.com/chde/apple-intelligence/
  8. Single-Page Applications: run locally on the user’s system.
  9. Progressive Web Apps: make SPAs offline-capable. A service worker sits between the website (HTML/JS) and the internet, answering fetch requests from its cache.
  10. Generative AI: Overview. Text: OpenAI GPT, Mistral, … Audio/Music: Musico, Soundraw, … Images: DALL·E, Firefly, … Video: Sora, Runway, … Speech: Whisper, tortoise-tts, …
  12. Generative AI: Drawbacks of cloud providers. They require a (stable) internet connection, are subject to network latency and server availability, transfer data to the cloud service, and require a subscription.
  13. Can we run GenAI models locally?
  14. Large Language Models. Large: trained on lots of data. Language: process and generate text. Models: programs/neural networks. Examples: Haiku, Opus, Sonnet (Claude); Gemini, Gemma (Google); LFM (Liquid AI).
  15. Large Language Models. Token: a meaningful unit of text (e.g., a word, a part of a word, a character). Context window: the maximum number of tokens the model can process. Parameters/weights: internal variables learned during training, used to make predictions.
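These terms can be made concrete with a rough heuristic: for English text, one token is often approximated as about four characters. The helper below is a sketch based on that rule of thumb only; real tokenizers (BPE, SentencePiece) are model-specific, so treat the numbers as estimates:

```typescript
// Rough token estimate: ~4 characters per token for English text.
// Heuristic only; actual tokenization differs per model.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Quick check whether a prompt likely fits into a model's context window.
function fitsContextWindow(prompt: string, contextWindow: number): boolean {
  return estimateTokens(prompt) <= contextWindow;
}
```

This kind of check is useful before stuffing a long todo list or document into a prompt for an on-device model with a limited context window.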
  16. Large Language Models. Prompts serve as the universal interface: unstructured text conveying specific semantics. This is a paradigm shift in software architecture: natural language becomes a first-class citizen. Caveats: non-determinism, hallucination, and prompt injections.
  17. LAB #1:
     npm i @huggingface/transformers
     npm start -- -o
  18. Transformers.js (@huggingface/transformers): a JavaScript library by Hugging Face 🤗. Functionally equivalent to Hugging Face’s transformers Python library. Supports various ML/AI use cases (LLMs, computer vision, audio, …). Models are executed on-device (100% local, offline-capable). Uses ONNX Runtime (a model inference runtime) internally.
  19. Transformers.js vs. WebLLM. Source: https://npmtrends.com/@huggingface/transformers-vs-@mlc-ai/web-llm (27.02.2026)
  20. Model Selection: Size Comparison.
     Model:Parameters      Size
     lfm2.5-thinking:1.2b  0.7 GB
     lfm2:2.6b             1.6 GB
     ministral-3:3b        3.0 GB
     gemma3:12b            8.1 GB
     gpt-oss:20b           14 GB
     devstral-2:123b       75 GB
  21. Model Selection: Liquid Foundation Models (LFM). Highly performant on-device LLMs by Liquid AI: https://www.liquid.ai/models. For this workshop, we are going to use the 2.6B model. Model sheet: https://huggingface.co/onnx-community/LFM2-2.6B-ONNX. Source: https://www.liquid.ai/blog/introducing-lfm2-2-6b-redefining-efficiency-in-language-models
  22. Downloading a model (LAB #2, 1/3). 1. Go to webgpureport.org. 2. Does your GPU support the feature “shader-f16”?
  23. Downloading a model (LAB #2, 2/3). 3. In todo.ts (ngOnInit()), add the following line:
     await this.llmService.loadModel('2.6B');
     4. If your GPU does not support f16, add this parameter:
     await this.llmService.loadModel('2.6B', 'q4');
  24. Downloading a model (LAB #2, 3/3). 5. In todo.html, change the following lines:
     @if(!llmService.isReady()) {
       <mat-progress-bar mode="determinate" [value]="llmService.progress()"></mat-progress-bar>
     }
     …
     <button mat-raised-button (click)="runPrompt(prompt.value, langModel.value)" [disabled]="!llmService.isReady()">
     The progress bar should begin to move.
  25. Cache API: storing model files locally. The website (HTML/JS) downloads model files from Hugging Face and stores them in the cache. Note: due to the Same-Origin Policy, models cannot be shared across origins.
  26. WebAssembly (Wasm): bytecode for the web and a compile target for arbitrary languages. Can be faster than JavaScript. WebLLM uses a model-specific Wasm library to accelerate model computations.
  27. WebGPU: grants low-level access to the Graphics Processing Unit (GPU) and delivers near-native performance for machine learning applications. Supported by Chromium-based browsers on Windows and macOS from version 113, by Safari 26, and by Firefox 141 on Windows.
  28. WebNN: grants web apps access to the device’s CPU, GPU, and Neural Processing Unit (NPU). In specification by the WebML Working Group at the W3C; Origin Trial in Chrome 146. Potentially even better performance compared to WebGPU. Source: https://webmachinelearning.github.io/webnn-intro/
  29. WebNN. DEMO: https://huggingface.co/webnn/spaces
  30. WebNN: near-native inference performance. Source: Intel. Browser: Chrome Canary 118.0.5943.0; DUT: Dell/Linux/i7-1260P, single P-core; workloads: MediaPipe solution models (FP32, batch=1).
  31. WebNN: Drawbacks. Models can’t be shared across origins. Inference is fast, but doesn’t reach full native speed.
  32. Model inference (LAB #3, 1/4). 1. In todo.ts, add the following line at the top of the class:
     protected readonly reply = signal('');
  33. Model inference (LAB #3, 2/4). 2. In the runPrompt() method, add the following code:
     this.reply.set('…');
     const chunks = inferenceEngine === 'transformers-js'
       ? this.inferTransformersJs(userPrompt)
       : this.inferPromptApi(userPrompt);
     let reply = '';
     for await (const chunk of chunks) {
       reply += chunk;
       this.reply.set(reply);
     }
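The for-await loop above works on any async iterable of string chunks. As a self-contained illustration of the same accumulation pattern, here is a sketch with a stubbed async generator standing in for the streaming model response (the chunk values are made up):

```typescript
// Stub async generator standing in for a streaming LLM response.
async function* fakeChunks(): AsyncGenerator<string> {
  for (const chunk of ['You have ', 'three ', 'todos.']) {
    yield chunk;
  }
}

// Accumulate streamed chunks into the full reply, mirroring the
// loop in runPrompt(). In the app, each iteration would also call
// this.reply.set(reply) to update the UI incrementally.
async function collectReply(chunks: AsyncIterable<string>): Promise<string> {
  let reply = '';
  for await (const chunk of chunks) {
    reply += chunk;
  }
  return reply;
}
```

Because the signal is updated on every chunk, the UI renders the partial reply as it streams in rather than waiting for the full response.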
  34. Model inference (LAB #3, 3/4). 3. In the inferTransformersJs() method, add the following code:
     this.llmService.clearPastKeyValues();
     const messages = [{ role: "user", content: userPrompt }];
     return this.llmService.generateResponse(messages, []);
  35. Model inference (LAB #3, 4/4). 4. In todo.html, change the following line:
     <pre>{{ reply() }}</pre>
     You should now be able to send prompts to the model and see the responses in the template. ⚠ Note: browsers support better options for streaming LLM responses: https://developer.chrome.com/docs/ai/render-llm-responses
  36. Todo management (LAB #4, 1/2). In todo.ts, add the following signal at the top:
     protected readonly todos = signal<TodoDto[]>([]);
     Add the following lines to the addTodo() method:
     text ??= prompt() ?? '';
     this.todos.update(todos => [...todos, { done: false, text }]);
  37. Todo management (LAB #4, 2/2). In todo.html, add the following lines to add todos from the UI:
     @for (todo of todos(); track $index) {
       <mat-list-option>{{ todo.text }}</mat-list-option>
     }
  38. Todo management, extended (LAB #5).
     @for (todo of todos(); track $index) {
       <mat-list-option [(selected)]="todo.done">{{ todo.text }}</mat-list-option>
     }
     ⚠ Boo! This pattern is not recommended. Instead, you should set the changed values on the signal, but that doesn’t play well with Angular Material…
  39. Chat with data: concept and limitations. The todo data has to be converted into natural language. For the sake of simplicity, we will add all todos to the prompt. Remember: LLMs have a context window (LFM2-2.6B: 32K tokens). If you need to chat with larger sets of text, refer to Retrieval-Augmented Generation (RAG). Example:
     These are the todos:
     * Wash clothes
     * Pet the dog
     * Take out the trash
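Converting the todo list into the bullet format shown above can be a small pure function. This is a sketch: the `TodoDto` shape matches the one used in the labs, but the exact wording of the header line is an assumption for illustration:

```typescript
// Shape used by the workshop's todo app.
interface TodoDto {
  done: boolean;
  text: string;
}

// Render the todos as a natural-language bullet list,
// ready to be embedded in a system prompt.
function todosToPrompt(todos: TodoDto[]): string {
  const lines = todos.map(todo => `* ${todo.text}`);
  return ['These are the todos:', ...lines].join('\n');
}
```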
  40. Chat with data: system prompt. A metaprompt that defines character, capabilities/limitations, output format, behavior, and grounding data. Hallucinations and prompt injections cannot be eliminated. Example: “You are a helpful assistant. Answer user questions on todos. Generate a valid JSON object. Avoid negative content. These are the user’s todos: …”
  41. Chat with data: flow. System message: “The user has these todos: 1. … 2. … 3. …” User message: “How many todos do I have?” Assistant message: “You have three todos.”
  42. Chat with data (LAB #6). Using a system & user prompt: adjust the code in inferTransformersJs() to include the system prompt:
     const systemPrompt = `Here's the user's todo list: ${JSON.stringify(this.todos())}`;
     const messages: ChatCompletionMessageParam[] = [
       { role: "system", content: systemPrompt },
       { role: "user", content: userPrompt }
     ];
  43. Prompt Engineering: techniques. Providing examples (single-shot, few-shot, …), priming outputs, specifying the output structure, repeating instructions, chain of thought, … Success also depends on the model. https://learn.microsoft.com/en-us/azure/ai-foundry/openai/concepts/prompt-engineering
  44. Prompt Engineering (LAB #7).
     const systemPrompt = `You are a helpful assistant.
     The user will ask questions about their todo list.
     Briefly answer the questions.
     Don't try to make up an answer if you don't know it.
     Here's the user's todo list: ${JSON.stringify(this.todos())}`;
  45. Prompt Engineering: alternatives (ordered by increasing effort): prompt engineering, Retrieval-Augmented Generation, fine-tuning, custom model.
  46. Performance (LAB #8). Adjust todo.ts as follows:
     return this.llmService.generateResponse(messages, [], { measurePerformance: true });
     Ask a new question and check your console for performance statistics.
  47. Performance: workshop participants (Device, Tokens/s decode).
  48. Performance comparison (tokens/s): WebLLM (Llama3-8b, M4): 45; Azure OpenAI (gpt-4o-mini): 33; Groq (Llama3-8b): 1200. WebLLM/Groq: own tests (14.11.2024); OpenAI/Azure OpenAI: https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/provisioned-throughput (18.07.2024)
  49. Tool Calling allows an LLM to execute “real-world” actions. A tool usually has a name, a natural-language description, and an interface definition, usually in JSON Schema. The LLM “calls” the tool; however, the developer has to take care of actually executing the code and feeding the result back into the conversation. Examples: add_todo, get_weather, search_web, read_file.
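The “developer executes the tool” step can be sketched as a small dispatcher. The normalized tool-call shape below (a name plus already-parsed arguments) is an assumption for illustration; real models emit tool calls in model-specific formats that the inference library normalizes first:

```typescript
// Hypothetical normalized tool-call shape; real wire formats vary per model.
interface ToolCall {
  name: string;
  arguments: Record<string, unknown>;
}

type ToolHandler = (args: Record<string, unknown>) => unknown;

// Look up the registered handler for each tool call, execute it, and
// collect the results so they can be fed back into the conversation.
function executeToolCalls(
  calls: ToolCall[],
  handlers: Record<string, ToolHandler>,
): unknown[] {
  return calls.map(call => {
    const handler = handlers[call.name];
    if (!handler) {
      throw new Error(`No handler registered for tool "${call.name}"`);
    }
    return handler(call.arguments);
  });
}
```

This mirrors the `executeToolCalls(reply, { addTodo: … })` call used in LAB #9: the model only names the tool; the application code decides what actually runs.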
  50. Tool Calling (LAB #9). 1. Add the TODO_TOOL to the tools array in inferTransformersJs():
     return this.llmService.generateResponse(messages, [TODO_TOOL], { measurePerformance: true });
     2. Add this line to the end of the runPrompt() method:
     this.llmService.executeToolCalls(reply, {
       addTodo: (args: { text: string }) => this.addTodo(args.text),
     });
  51. Web AI Landscape. Bring Your Own AI (BYOAI): libraries (WebLLM), frameworks (Transformers.js, ONNX Runtime, TensorFlow.js), APIs (WebGPU, WebNN, Cross-Origin Storage NEW!). Built-in AI (BIAI): Writing Assistance APIs (Summarizer API, Writer API, Rewriter API, Proofreader API NEW!), Translator & Language Detector APIs, Prompt API (NEW! multimodal input & structured output), WebMCP NEW!
  52. Prompt API (LAB #10). In a current version of Chrome or Edge, open about://flags and set “Enables optimization guide on device” → EnabledBypassPerfRequirement and “Prompt API for Gemini Nano” → Enabled. Then run:
     await LanguageModel.create();
     Check about://components and about://on-device-internals.
  53. Prompt API. The website (HTML/JS) talks to the browser, which runs a model provided by the operating system (Apple Intelligence) or downloaded from the internet (Gemini Nano).
  54. Prompt API: part of Chrome’s Built-in AI initiative. An exploratory API for local experiments and use-case determination. Downloads Gemini Nano into Google Chrome; the model can be shared across origins. Uses native APIs directly. A fine-tuning API might follow in the future. https://developer.chrome.com/docs/ai/built-in
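Because the Prompt API is experimental and only available in some browsers, code should feature-detect it before use. A minimal guard, assuming the `LanguageModel.availability()` method from the current Prompt API draft (it resolves to a string such as 'unavailable', 'downloadable', 'downloading', or 'available'):

```typescript
// Feature-detect the experimental Prompt API before using it.
// Returns 'unavailable' when the API is missing from the environment.
async function promptApiAvailability(): Promise<string> {
  const LanguageModel = (globalThis as Record<string, any>).LanguageModel;
  if (!LanguageModel || typeof LanguageModel.availability !== 'function') {
    return 'unavailable';
  }
  return await LanguageModel.availability();
}
```

An app can use this to fall back to Transformers.js (or hide the AI feature entirely) when the built-in model is not present.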
  55. Prompt API (LAB #11).
     npm i -D @types/dom-chromium-ai
     Add "dom-chromium-ai" to the types array in tsconfig.app.json.
  56. Local AI Models (LAB #12). Add the following lines to inferPromptApi():
     const systemPrompt = `The user will ask questions about their todo list.
     Here's the user's todo list: ${JSON.stringify(this.todos())}`;
     const languageModel = await LanguageModel.create({
       initialPrompts: [{ role: "system", content: systemPrompt }]
     });
     return languageModel.promptStreaming(userPrompt);
  57. Local AI Models. Alternatives: Ollama, a local runner for AI models. It offers a local server a website can connect to, which allows sharing models across origins. Supported on Windows, Linux, and macOS. https://ollama.ai/
  58. Local AI Models. Alternatives: Hugging Face Transformers. Pre-trained, specialized, significantly smaller models beyond GenAI. Examples: text generation, image classification, translation, speech recognition, image-to-text.
  59. Local AI Models. Alternatives: Transformers.js. Pre-trained, specialized, significantly smaller models beyond GenAI; a JavaScript library to run Hugging Face transformers in the browser. https://huggingface.co/docs/transformers.js
  60. Realtime Models: on-device pipeline. Whisper (STT), Silero (VAD), SmolLM2-1.7B (LLM), Kokoro (TTS).
  61. Use Case: Data Extraction. “Just transfer the 17.34 euros to me, my IBAN is DE02200505501015871393. I am with Hamburger Sparkasse (HASPDEHH).” “Nice, here is my address: Peter Müller, Rheinstr. 7, 04435 Schkeuditz”
  63. Idea: a Smart Form Filler (LLM) maps free text like “Nice, here is my address: Peter Müller, Rheinstr. 7, 04435 Schkeuditz” onto a form:
     protected readonly formGroup = this.fb.group({
       firstName: [''],
       lastName: [''],
       addressLine1: [''],
       addressLine2: [''],
       city: [''],
       state: [''],
       zip: [''],
       country: [''],
     });
  64. Form Field. “Insurance numbers always start with INS.” “Try to determine the country based on the input.”
  65. Form Field (LAB #13, 1/2). Add the following code to form.ts:
     private readonly fb = inject(NonNullableFormBuilder);
     protected readonly formGroup = this.fb.group({
       name: '',
       city: '',
     });
     async fillForm(value: string) {}
  66. Form Field (LAB #13, 2/2). Add the following code to form.html:
     <input type="text" #form>
     <button (click)="fillForm(form.value)">Fill form</button>
     <form [formGroup]="formGroup">
       <input placeholder="Name" formControlName="name">
       <input placeholder="City" formControlName="city">
     </form>
  67. Async Clipboard API: allows reading from and writing to the clipboard in an asynchronous manner. Reading from the clipboard requires user consent first (privacy!). Supported by Chrome, Edge, Safari, and Firefox.
  68. Async Clipboard API (LAB #14, 1/2). Add the following code to form.ts:
     async paste() {
       const content = await navigator.clipboard.readText();
       await this.fillForm(content);
     }
  69. Async Clipboard API (LAB #14, 2/2). Add the following code to form.html (after the “Fill form” button):
     <button (click)="paste()">Paste</button>
  70. Prompt Generator: flow. System message: “The form has the following setup: { "name": "", "city": "" }” User message: “I am Peter from Berlin” Assistant message: “{ "name": "Peter", "city": "Berlin" }”
  71. Prompt Generator (LAB #15). Add the following code to the fillForm() method:
     const languageModel = await LanguageModel.create({
       initialPrompts: [{
         role: 'system',
         content: `Extract the information to a JSON object of this shape: ${JSON.stringify(this.formGroup.value)}`,
       }],
     });
     const result = await languageModel.prompt(value);
     console.log(result);
  72. Prompt Generator: Structured Output (LAB #16). Add the following code to form.ts (fillForm() method):
     const result = await languageModel.prompt(value, {
       responseConstraint: {
         type: 'object',
         properties: {
           name: { type: 'string' },
           city: { type: 'string' }
         }
       }
     });
  73. Prompt Parser. Assistant message: “{ "name": "Peter", "city": "Berlin" }”
  74. Prompt Parser (LAB #17). Add the following code to form.ts (fillForm() method):
     this.formGroup.setValue(JSON.parse(result));
  75. Prompt Parser: getting structured data out of the model. Assistant message: parsing the assistant message as text/JSON/… JSON Mode/Structured Output: specifying a well-defined interface via a JSON schema (safer, growing support). Tool calling: tools called by the LLM.
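When parsing the assistant message manually (the first option above), replies sometimes arrive wrapped in Markdown code fences or stray whitespace. A defensive parse helper, as a sketch:

```typescript
// Parse an LLM reply as JSON, tolerating surrounding Markdown code
// fences and whitespace. Throws if the remaining text is not valid JSON.
function parseJsonReply<T>(reply: string): T {
  const stripped = reply
    .trim()
    .replace(/^```(?:json)?\s*/i, '')
    .replace(/```\s*$/, '');
  return JSON.parse(stripped) as T;
}
```

Wrapping the `JSON.parse` call this way (plus a try/catch around `formGroup.setValue`) keeps the form filler from crashing when the model occasionally produces malformed output.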
  76. WebMCP: allows websites to expose tools to the browser or external agents. A joint effort by Microsoft and Google. https://github.com/webmachinelearning/webmcp
  77. WebMCP API. Imperative:
     navigator.modelContext.provideContext({
       tools: [{
         "name": "start_game",
         "description": "Start a new game.",
         "inputSchema": {},
         "execute": () => {},
       }]
     });
     Declarative:
     <form id="reservationForm" toolname="book_table_le_petit_bistro" tooldescription=...>
       <input name="name" toolparamdescription="Customer's full name (min 2 chars)" />
     </form>
  78. Summary: Pros & Cons. Pros: data does not leave the browser (privacy); high availability (offline support); low latency; stability (no external API changes); low cost. Cons: lower quality; high system (RAM, GPU) and bandwidth requirements; large model size, and models cannot always be shared; model initialization and inference are relatively slow; APIs are experimental.
  79. Summary. Cloud-based models remain the most powerful models. Due to their size and high system requirements, local generative AI models are currently rather interesting for very special scenarios (e.g., high privacy demands, offline availability). Small, specialized models are an interesting alternative (if available). Large language models are becoming more compact and efficient, vendors are shipping AI models with their devices, and devices are becoming more powerful for running AI workloads. Experiment with the AI APIs and make your Angular app smarter!