Christian Liebel
@christianliebel
Consultant
Making Angular Apps Smarter with
Generative AI
Local and Offline-capable (Hands-on)
Slide 2
Hello, it’s me.
Christian Liebel
X: @christianliebel
Bluesky: @christianliebel.com
Email: christian.liebel@thinktecture.com
Angular, PWA & Generative AI
Slides: thinktecture.com/christian-liebel
Slide 3
09:00–10:30 Block 1
10:30–11:00 Coffee Break
11:00–12:30 Block 2
12:30–13:30 Lunch Break
13:30–15:00 Block 3
15:00–15:30 Coffee Break
15:30–16:30 Block 4
Timetable
Slide 4
What to expect
Focus on web app development
Focus on Generative AI
Up-to-date insights: the ML/AI field is evolving fast
Live demos on real hardware
Hands-on labs
What not to expect
Deep dive into AI specifics, RAG, model fine-tuning, or training
Stable libraries or specifications
WebSD in Angular
Polished workshop
Expectations
Huge downloads! High requirements! Things may break!
Slide 5
(Workshop Edition)
Demo Use Case
DEMO
Slide 6
Setup complete? (Node.js, Google Chrome, Editor, Git, macOS/Windows, 20 GB free disk space, 6 GB VRAM)
Setup
Slide 7
https://webgpureport.org/
Setup
Slide 8
ng new genai-app
ng add @angular/material
Slide 9
git clone https://github.com/thinktecture/ijs-muc-2024-genai.git
cd ijs-muc-2024-genai
npm i
npm start -- --open
Setup
LAB
Slide 10
Generative AI everywhere
Source: https://www.apple.com/chde/apple-intelligence/
Slide 11
Run locally on the user’s system
Single-Page Applications
[Diagram: A web server delivers HTML, JS, CSS, and assets over HTTPS to the web browser. The SPA in the browser contains the client logic and its views (HTML/CSS); the server logic exposes web APIs and a push service (via HTTPS and WebSockets) backed by databases.]
Slide 12
Make SPAs offline-capable
Progressive Web Apps
[Diagram: The service worker sits between the website (HTML/JS) and the internet, intercepting fetch requests and serving responses from its cache.]
Slide 13
Overview
Generative AI
Text: OpenAI GPT, Mistral, …
Speech: OpenAI Whisper, tortoise-tts, …
Images: DALL·E, Stable Diffusion, …
Audio/Music: Musico, Soundraw, …
Slide 15
Examples
Generative AI Cloud Providers
Slide 16
Drawbacks
Generative AI Cloud Providers
Require a (stable) internet connection
Subject to network latency and server availability
Data is transferred to the cloud service
Require a subscription
Slide 17
Can we run GenAI models locally?
Slide 18
Large: Trained on lots of data
Language: Process and generate text
Models: Programs/neural networks
Examples:
– GPT (ChatGPT, Microsoft Copilot, …)
– Gemini, Gemma (Google)
– Llama (Meta AI)
Large Language Models
Slide 19
Token
A meaningful unit of text (e.g., a word, a part of a word, a character).
Context Window
The maximum number of tokens the model can process.
Parameters/weights
Internal variables learned during training, used to make predictions.
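To make these definitions concrete, here is a rough sketch of checking a prompt against a context window. The ~4 characters per token heuristic and both helper names are assumptions for illustration; real tokenizers vary by model.

```typescript
// Rule of thumb: ~4 characters per token for English text (hypothetical
// helper; real tokenizers such as BPE produce different counts).
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Check whether an estimated prompt length fits into the context window.
function fitsContextWindow(prompt: string, contextWindow: number): boolean {
  return estimateTokens(prompt) <= contextWindow;
}

const prompt = 'Summarize my todo list.';
console.log(estimateTokens(prompt));           // a handful of tokens
console.log(fitsContextWindow(prompt, 8192));  // fits into an 8K window
```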
Large Language Models
Slide 20
Prompts serve as the universal interface
Unstructured text conveying specific semantics
Paradigm shift in software architecture
Natural language becomes a first-class citizen
Caveats
Non-determinism and hallucination, prompt injections
Large Language Models
Slide 21
Size Comparison
Model:Parameters Size
phi3:3b 2.2 GB
mistral:7b 4.1 GB
llama3:8b 4.7 GB
gemma2:9b 5.4 GB
gemma2:27b 16 GB
llama3:70b 40 GB
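The sizes above roughly follow parameters × bits per weight. A back-of-the-envelope sketch; the 4.5 effective bits per parameter for 4-bit quantized models is an assumption that accounts for quantization scales and metadata:

```typescript
// Approximate download size: parameters × bits per weight / 8 bytes.
// bitsPerWeight = 4.5 is an assumed average for 4-bit quantized models;
// actual files vary by quantization scheme.
function approxSizeGB(parametersBillion: number, bitsPerWeight = 4.5): number {
  const bytes = parametersBillion * 1e9 * (bitsPerWeight / 8);
  return bytes / 1e9; // decimal GB
}

console.log(approxSizeGB(7).toFixed(1));  // ≈ 3.9 GB, close to mistral:7b (4.1 GB)
console.log(approxSizeGB(70).toFixed(1)); // ≈ 39.4 GB, close to llama3:70b (40 GB)
```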
Large Language Models
Slide 22
https://webllm.mlc.ai/
WebLLM
DEMO
Slide 23
On NPM
WebLLM
Slide 24
npm i @mlc-ai/web-llm
LAB
Slide 25
(1/3)
In app.component.ts, add the following lines (importing signal from @angular/core and MLCEngine from @mlc-ai/web-llm):
protected readonly progress = signal(0);
protected readonly ready = signal(false);
protected engine?: MLCEngine;
Downloading a model LAB
Slide 26
(2/3)
In app.component.ts (ngOnInit()), add the following lines (importing CreateMLCEngine from @mlc-ai/web-llm):
const model = 'Llama-3.2-3B-Instruct-q4f16_1-MLC';
this.engine = await CreateMLCEngine(model, {
initProgressCallback: ({ progress }) =>
this.progress.set(progress)
});
this.ready.set(true);
Downloading a model LAB
Slide 27
(3/3)
In app.component.html, add the following lines:
Ask
Launch the app via npm start. The progress bar should begin to move.
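The template markup did not survive extraction; here is a minimal sketch of what this step likely adds (hypothetical markup, assuming Angular Material's progress bar and a plain input):

```html
<!-- Hypothetical sketch; the original slide's markup was not preserved. -->
<mat-progress-bar [value]="progress() * 100" />
<input #promptInput type="text" />
<button (click)="runPrompt(promptInput.value)" [disabled]="!ready()">Ask</button>
```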
Downloading a model LAB
Slide 28
Storing model files locally
Cache API
[Diagram: The website (HTML/JS) downloads model files from Hugging Face over the internet and stores them in a cache.]
Note: Due to the Same-Origin Policy, models cannot be shared across origins.
Slide 29
Parameter cache
Cache API
Slide 30
WebAssembly (Wasm)
– Bytecode for the web
– Compile target for arbitrary languages
– Can be faster than JavaScript
– WebLLM uses a model-specific Wasm library to accelerate model computations
Slide 31
WebGPU
– Grants low-level access to the Graphics Processing Unit (GPU)
– Near-native performance for machine learning applications
– Supported by Chromium-based browsers on Windows and macOS from version 113
Slide 32
– Grants web apps access to the device’s CPU, GPU and Neural Processing Unit (NPU)
– In specification by the WebML Working Group at W3C
– Implementation in progress in Chromium (behind a flag)
– Even better performance compared to WebGPU
WebNN
Source: https://webmachinelearning.github.io/webnn-intro/
DEMO
Slide 33
WebNN: near-native inference performance
Source: Intel. Browser: Chrome Canary 118.0.5943.0, DUT: Dell/Linux/i7-1260P, single p-core, Workloads: MediaPipe solution models (FP32, batch=1)
Slide 34
(1/3)
In app.component.ts, add the following lines at the top of the class:
protected readonly reply = signal('');
Model inference LAB
Slide 35
(2/3)
In the runPrompt() method, add the following code:
await this.engine!.resetChat();
this.reply.set('…');
const messages: ChatCompletionMessageParam[] = [
{ role: "user", content: value }
];
const reply = await this.engine!.chat.completions.create({ messages });
this.reply.set(reply.choices[0].message.content ?? '');
Model inference LAB
Slide 36
(3/3)
In app.component.html, add the following line:
{{ reply() }}
You should now be able to send prompts to the model and see the responses in the template.
Model inference LAB
Slide 37
npm run build
LAB
Slide 38
1. In angular.json, increase the bundle size for the Angular project (property architect.build.configurations.production.budgets[0].maximumError) to at least 5 MB.
2. Then, run npm run build again. This time, the build should succeed.
3. If you stopped the development server, don’t forget to bring it back up again (npm start).
Build issues LAB
Slide 39
(1/2)
In app.component.ts, add the following signal at the top:
protected readonly todos = signal<{ done: boolean; text: string }[]>([]);
Add the following method:
addTodo(text: string) {
this.todos.update(todos => [...todos, { done: false, text }]);
}
Todo management LAB
Slide 40
(2/2)
In app.component.html, add the following lines to add todos from the UI:
Add
@for(todo of todos(); track $index) {
{{ todo.text }}
}
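The input and button markup around the @for block did not survive extraction; a minimal sketch of what this step likely contains (hypothetical markup):

```html
<!-- Hypothetical sketch; the original slide's markup was not preserved. -->
<input #todoInput type="text" />
<button (click)="addTodo(todoInput.value)">Add</button>
@for (todo of todos(); track $index) {
  <p>{{ todo.text }}</p>
}
```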
Todo management LAB
Slide 41
In app.component.ts, add the following method:
toggleTodo(index: number) {
this.todos.update(todos => todos.map((todo, todoIndex) =>
todoIndex === index ? { ...todo, done: !todo.done } : todo));
}
In app.component.html, add the following content to the node:
You should now be able to toggle the checkboxes.
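A minimal sketch of the checkbox binding this step refers to (hypothetical markup, assuming a plain HTML checkbox inside the repeated node):

```html
<!-- Hypothetical sketch; the original slide's markup was not preserved. -->
<input type="checkbox" [checked]="todo.done" (change)="toggleTodo($index)" />
```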
Todo management (extended) LAB
Slide 42
Concept and limitations
The todo data has to be converted into natural language. For the sake of simplicity, we will add all todos to the prompt. Remember: LLMs have a context window (Mistral-7B: 8K). If you need to chat with larger sets of text, refer to Retrieval-Augmented Generation (RAG).
These are the todos:
* Wash clothes
* Pet the dog
* Take out the trash
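The conversion described above can be sketched as a small helper; the function name and the Todo shape are hypothetical:

```typescript
// Hypothetical helper: serialize todos into natural language for the prompt.
interface Todo { text: string; done: boolean; }

function todosToPrompt(todos: Todo[]): string {
  if (todos.length === 0) {
    return 'The list is empty, there are no todos.';
  }
  return 'These are the todos:\n' +
    todos.map(todo => `* ${todo.text}`).join('\n');
}

const prompt = todosToPrompt([
  { text: 'Wash clothes', done: false },
  { text: 'Pet the dog', done: false },
  { text: 'Take out the trash', done: true },
]);
// prompt now matches the example on this slide
```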
Chat with data
Slide 43
System prompt
Metaprompt that defines…
– character
– capabilities/limitations
– output format
– behavior
– grounding data
Hallucinations and prompt injections cannot be eliminated.
You are a helpful assistant.
Answer user questions on todos.
Generate a valid JSON object.
Avoid negative content.
These are the user’s todos: …
Chat with data
Slide 44
Flow
System message: “The user has these todos: 1. … 2. … 3. …”
User message: “How many todos do I have?”
Assistant message: “You have three todos.”
Chat with data
Slide 45
Using a system & user prompt
Adjust the implementation in runPrompt() to include the system prompt:
const systemPrompt = `Here's the user's todo list:
${this.todos().map(todo => `* ${todo.text} (${todo.done ? 'done' : 'not done'})`).join('\n')}`;
const messages: ChatCompletionMessageParam[] = [
{ role: "system", content: systemPrompt },
{ role: "user", content: value }
];
Chat with data LAB
Slide 46
Techniques
– Providing examples (single shot, few shot, …)
– Priming outputs
– Specifying output structure
– Repeating instructions
– Chain of thought
– …
Success also depends on the model.
Prompt Engineering
https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/advanced-prompt-engineering
Slide 47
const systemPrompt = `You are a helpful assistant.
The user will ask questions about their todo list.
Briefly answer the questions.
Don't try to make up an answer if you don't know it.
Here's the user's todo list:
${this.todos().map(todo => `* ${todo.text} (this todo is ${todo.done ? 'done' : 'not done'})`).join('\n')}
${this.todos().length === 0 ? 'The list is empty, there are no todos.' : ''}`;
Prompt Engineering LAB
Slide 48
Alternatives (by increasing effort)
– Prompt Engineering
– Retrieval-Augmented Generation
– Fine-tuning
– Custom model
Prompt Engineering
Slide 49
Add the following line to the runPrompt() method:
console.log(reply.usage);
Ask a new question and check your console for performance statistics.
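The usage statistics can be turned into a throughput figure. A sketch, assuming you measure wall-clock time around the completion call yourself; the helper name is hypothetical, and WebLLM's usage object also reports its own throughput numbers:

```typescript
// Hypothetical helper: decode throughput from token count and elapsed time.
// completion token counts follow the OpenAI-style usage object.
function tokensPerSecond(completionTokens: number, elapsedMs: number): number {
  return completionTokens / (elapsedMs / 1000);
}

console.log(tokensPerSecond(150, 10000)); // 15 tokens/s, in line with an M1 below
```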
Performance LAB
Slide 50
Workshop Participants
Device Tokens/s (Decode)
MacBook i7 (2021) 2.70
MacBook M1 (1) 15.35
MacBook M1 (2) 14.52
MacBook M4 19.70
DELL i7 + Iris (1) 1.43
DELL i7 + Iris (2) 3.15
Performance
Slide 51
Comparison (Tokens/sec)
WebLLM (Llama3-8b, M4): 45
Azure OpenAI (gpt-4o-mini): 33
Groq (Llama3-8b): 1200
Performance
WebLLM/Groq: Own tests (14.11.2024), OpenAI/Azure OpenAI: https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/provisioned-throughput (18.07.2024)
Slide 52
DEMO
Slide 53
Just transfer the 17.34 euros to me, my IBAN is DE02200505501015871393. I am with Hamburger Sparkasse (HASPDEHH).
Data Extraction
Making Angular Apps Smarter with Generative AI
Local and Offline-capable (Hands-on)
Use Case
Nice, here is my address: Peter Müller, Rheinstr. 7, 04435 Schkeuditz
Slide 55
protected readonly formGroup = this.fb.group({
firstName: [''],
lastName: [''],
addressLine1: [''],
addressLine2: [''],
city: [''],
state: [''],
zip: [''],
country: [''],
});
Idea
[Diagram: free-text input (“Nice, here is my address: Peter Müller, Rheinstr. 7, 04435 Schkeuditz”) → Smart Form Filler (LLM) → form fields]
Slide 56
Form Field → Prompt Generator → Model Backend → Response Parser
Architecture
Slide 57
Form Field
“Try to determine the country based on the input.”
“If present, put the department in this field.”
Slide 58
(1/2)
Add the following code to app.component.ts:
private fb = inject(NonNullableFormBuilder);
protected formGroup = this.fb.group({
name: '',
city: '',
});
async fillForm(value: string) {}
Form Field LAB
Slide 59
(2/2)
Add the following code to app.component.html:
Fill form
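The template markup did not survive extraction; a minimal sketch of what this step likely adds (hypothetical markup, assuming ReactiveFormsModule is imported for the formGroup binding):

```html
<!-- Hypothetical sketch; the original slide's markup was not preserved. -->
<textarea #addressInput></textarea>
<button (click)="fillForm(addressInput.value)">Fill form</button>
<form [formGroup]="formGroup">
  <input formControlName="name" placeholder="Name" />
  <input formControlName="city" placeholder="City" />
</form>
```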
Form Field LAB
Slide 60
Async Clipboard API
Allows reading from/writing to the clipboard in an asynchronous manner
Reading from the clipboard requires user consent first (privacy!)
Supported by Chrome, Edge, Safari, and Firefox
Prompt Generator
Slide 61
(1/2)
Add the following code to app.component.ts:
async paste() {
const content = await navigator.clipboard.readText();
await this.fillForm(content);
}
Async Clipboard API LAB
Slide 62
(2/2)
Add the following code to app.component.html
(after the “Fill form” button):
Paste
Async Clipboard API LAB
Slide 63
System message: “The form has the following setup: { "name": "", "city": "" }”
User message: “I am Peter from Berlin”
Assistant message: “{ "name": "Peter", "city": "Berlin" }”
Prompt Generator
Slide 64
Add the following code to app.component.ts (fillForm() method):
const messages: ChatCompletionMessageParam[] = [{
role: "system",
content: `Extract the information to a JSON object of this shape:
${JSON.stringify(this.formGroup.value)} Do not add any other text.`
}, {
role: "user", content: value
}];
Prompt Generator LAB
Slide 65
Model Backend
– Cloud
– WebLLM
– Prompt API
– Ollama (on-premise)
– …
Slide 66
Add the following code to app.component.ts (fillForm() method):
const reply = await this.engine!.chat.completions.create({ messages });
Model Backend LAB
Slide 67
Prompt Parser
Assistant message: { "name": "Peter", "city": "Berlin" }
Slide 68
Add the following code to app.component.ts (fillForm() method):
this.formGroup.setValue(JSON.parse(reply.choices[0].message.content ?? ''));
Prompt Parser LAB
Slide 69
Assistant message
Parsing the assistant message as text/JSON/…
Tool calling
Specifying a well-defined interface via a JSON schema called by the LLM (safer, growing support)
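When relying on assistant-message parsing rather than tool calling, it helps to parse defensively, since models sometimes wrap JSON in Markdown code fences or add stray text. A sketch with a hypothetical helper:

```typescript
// Hypothetical helper: tolerant JSON extraction from an assistant message.
function extractJson<T>(reply: string): T {
  // Strip ```json … ``` fences if present, then parse the first {...} block.
  const cleaned = reply.replace(/```(?:json)?/g, '');
  const start = cleaned.indexOf('{');
  const end = cleaned.lastIndexOf('}');
  if (start === -1 || end === -1) {
    throw new Error('No JSON object found in reply');
  }
  return JSON.parse(cleaned.slice(start, end + 1)) as T;
}

const parsed = extractJson<{ name: string; city: string }>(
  '```json\n{ "name": "Peter", "city": "Berlin" }\n```'
);
// parsed.name === 'Peter', parsed.city === 'Berlin'
```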
Prompt Parser
Slide 70
https://www.google.com/chrome/canary/
about://flags
Enables optimization guide on device → EnabledBypassPerfRequirement
Prompt API for Gemini Nano → Enabled
await ai.languageModel.create();
about://components
Prompt API
LAB
Slide 71
Prompt API
[Diagram: Website (HTML/JS) → Browser → Operating System / Internet; models: Apple Intelligence, Gemini Nano.]
Slide 72
Part of Chrome’s Built-In AI initiative
– Exploratory API for local experiments and use case determination
– Downloads Gemini Nano into Google Chrome
– Model can be shared across origins
– Uses native APIs directly
– Fine-tuning API might follow in the future
Prompt API
https://developer.chrome.com/docs/ai/built-in
Slide 73
npm i -D @types/dom-chromium-ai
Add @types/dom-chromium-ai to the types in tsconfig.app.json.
Prompt API
LAB
Slide 74
Adjust the implementations of runPrompt()/fillForm():
const session = await window.ai.languageModel.create({ systemPrompt });
const reply = await session.prompt(value);
// runPrompt(): this.reply.set(reply);
// fillForm(): this.formGroup.setValue(JSON.parse(reply));
Prompt API LAB
Slide 75
Alternatives: Ollama
– Local runner for AI models
– Offers a local server a website can connect to → allows sharing models across origins
– Supported on macOS and Linux (Windows in Preview)
https://webml-demo.vercel.app/
https://ollama.ai/
Local AI Models
Slide 76
Alternatives: Hugging Face Transformers
Pre-trained, specialized, significantly smaller models beyond GenAI
Examples:
– Text generation
– Image classification
– Translation
– Speech recognition
– Image-to-text
Local AI Models
Slide 77
Alternatives: Transformers.js
– Pre-trained, specialized, significantly smaller models beyond GenAI
– JavaScript library to run Hugging Face transformers in the browser
– Supports most of the models
https://huggingface.co/docs/transformers.js
Local AI Models
Slide 78
Pros & Cons
+ Data does not leave the browser (privacy)
+ High availability (offline support)
+ Low latency
+ Stability (no external API changes)
+ Low cost
– Lower quality
– High system (RAM, GPU) and bandwidth requirements
– Large model size, models cannot always be shared
– Model initialization and inference are relatively slow
– APIs are experimental
Local AI Models
Slide 79
– Cloud-based models remain the most powerful models
– Due to their size and high system requirements, local generative AI models are currently most interesting for special scenarios (e.g., high privacy demands, offline availability)
– Small, specialized models are an interesting alternative (if available)
– Large language models are becoming more compact and efficient
– Vendors start shipping AI models with their devices
– Devices are becoming more powerful for running AI tasks
– Experiment with the AI APIs and make your Angular app smarter!
Summary
Slide 80
Thank you
for your kind attention!
Christian Liebel
@christianliebel
[email protected]