Slide 1

Slide 1 text

A HANDS-ON AI AGENT GUIDE with Gemma, Ollama and more!

Slide 2

Slide 2 text

ABOUT ME
Bilge Yucel
πŸ₯‘ Developer Relations Engineer at deepset
πŸ— Open source LLM framework: Haystack
πŸŽ“ Sabanci University B.Sc. 🧠 KU Leuven M.Sc.
πŸ“ Istanbul, Turkey
@BILGEYCL /IN/BILGE-YUCEL BILGEYUCEL

Slide 3

Slide 3 text

AGENDA ● AI AGENTS ● BUILDING TOOLS ● BUILDING TIME ● Q&A

Slide 4

Slide 4 text

AI AGENTS An agent in an LLM-powered system is an autonomous entity that uses a language model as its core to plan, reflect, store memory, and call external tools, enabling it to execute complex tasks and interact with external data sources efficiently.

Slide 5

Slide 5 text

AI AGENTS Overview of an LLM-powered autonomous agent system. Source: https://lilianweng.github.io/posts/2023-06-23-agent/

Slide 6

Slide 6 text

AI AGENTS EXAMPLES
● Can you mail me the latest Hacker News articles?
  β—‹ Reason + Act = ReAct
  β—‹ Tool calls, function calling (see the sketch below)
    β–  find latest Hacker News articles
    β–  generate an email
    β–  send it
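As a rough illustration of the tool-calling idea (a plain-Python sketch; the tool names and bodies are hypothetical placeholders, not code from the talk), a ReAct-style loop lets the model decide which tool to call next (Reason), executes it (Act), and feeds the observation back:

# Hypothetical tools for the Hacker News example.
def find_latest_hn_articles(limit: int = 5) -> list[str]:
    # Placeholder: a real tool would call the Hacker News API here.
    return [f"Article {i}" for i in range(1, limit + 1)]

def generate_email(articles: list[str]) -> str:
    return "Latest Hacker News articles:\n" + "\n".join(articles)

def send_email(body: str, to: str) -> str:
    # Placeholder: a real tool would use an SMTP server or mail API.
    return f"Sent {len(body)} characters to {to}"

TOOLS = {
    "find_latest_hn_articles": find_latest_hn_articles,
    "generate_email": generate_email,
    "send_email": send_email,
}

def run_tool(name: str, **kwargs):
    # The observation returned here is fed back to the model for its next reasoning step.
    return TOOLS[name](**kwargs)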

Slide 7

Slide 7 text

AI AGENTS EXAMPLES
● Generate JSON from {this} text
  β—‹ Self-reflection: generates JSON, then verifies it in a second iteration (see the sketch below)
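A self-reflection loop for the JSON example could look roughly like this sketch; generate() is a hypothetical stand-in for whatever LLM call you use (e.g. Gemma 2 via Ollama):

import json

def generate(prompt: str) -> str:
    # Hypothetical stand-in for an LLM call.
    raise NotImplementedError

def generate_valid_json(text: str, max_iterations: int = 3) -> dict:
    prompt = f"Generate JSON from this text:\n{text}"
    for _ in range(max_iterations):
        reply = generate(prompt)
        try:
            return json.loads(reply)  # valid JSON: done
        except json.JSONDecodeError as err:
            # Self-reflection: feed the error back and ask the model to correct itself.
            prompt = f"This JSON is invalid ({err}). Return a corrected version:\n{reply}"
    raise ValueError("No valid JSON produced after self-reflection")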

Slide 8

Slide 8 text

AI AGENTS EXAMPLES
● Who is Taylor Swift? + When was she born?
  β—‹ Memory: the follow-up is resolved to β€œWhen was Taylor Swift born?” (see the sketch below)
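A minimal sketch of the memory idea, assuming a hypothetical chat() call; keeping the running message history is what lets the model resolve β€œshe” in the follow-up question:

def chat(messages: list[dict]) -> str:
    # Hypothetical stand-in for a chat-model call (e.g. Gemma 2 via Ollama).
    raise NotImplementedError

history: list[dict] = []

def ask(question: str) -> str:
    # Memory = the accumulated history; the whole conversation is sent each time,
    # so "When was she born?" effectively becomes "When was Taylor Swift born?".
    history.append({"role": "user", "content": question})
    reply = chat(history)
    history.append({"role": "assistant", "content": reply})
    return reply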

Slide 9

Slide 9 text

BUILDING TOOLS ● Gemma 2 ● Ollama ● Haystack

Slide 10

Slide 10 text

GEMMA 2 ● An open language model by Google ● 2B, 9B, and 27B parameter sizes ● 8192-token context length ● Wide range of tasks: question answering, commonsense reasoning, mathematics, science, and coding ● Gemma License ● More info in the paper β€œGemma 2: Improving Open Language Models at a Practical Size”

Slide 11

Slide 11 text

OLLAMA ● Open source tool to run models locally on macOS, Linux, and Windows ● Models come with Ollama (model library) ● Mostly 4-bit quantization ● Runs on GPU, falls back to CPU ● Mostly for PoC / prototyping

Slide 12

Slide 12 text

OLLAMA

Slide 13

Slide 13 text

How to use OLLAMA
● Install Ollama on your computer
● >> ollama pull
● >> ollama serve
(see the local test sketch below)
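For a quick local check, something like the following should work; this is a sketch assuming the ollama Python client is installed and a Gemma 2 model has been pulled (the gemma2 tag is an example, not prescribed by the slides):

import ollama  # pip install ollama; assumes `ollama pull gemma2` and `ollama serve` were run

response = ollama.chat(
    model="gemma2",  # assumed example model tag
    messages=[{"role": "user", "content": "What is an AI agent?"}],
)
print(response["message"]["content"])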

Slide 14

Slide 14 text

HAYSTACK
● Haystack is an open source Python framework for building production-ready LLM applications
● RAG, semantic search, agents…
● Prototyping, evaluation, deployment, monitoring…
● Building blocks: Components & Pipelines (see the sketch below)
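A minimal sketch of the components-and-pipelines idea, assuming Haystack 2.x and the ollama-haystack integration are installed; class and parameter names are to the best of my knowledge, not copied from the slides:

from haystack import Pipeline
from haystack.components.builders import PromptBuilder
from haystack_integrations.components.generators.ollama import OllamaGenerator

# Two components wired into a pipeline: build a prompt, then generate with Gemma 2.
pipeline = Pipeline()
pipeline.add_component("prompt_builder", PromptBuilder(template="Answer briefly: {{ query }}"))
pipeline.add_component("llm", OllamaGenerator(model="gemma2"))
pipeline.connect("prompt_builder.prompt", "llm.prompt")

result = pipeline.run({"prompt_builder": {"query": "What is Haystack?"}})
print(result["llm"]["replies"][0])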

Slide 15

Slide 15 text

Web-Enhanced Self-Reflecting Agent
● Retrieval Augmented Generation (RAG)
● If the retrieved context is not enough, it goes to the web to enrich the context 🌐
● If the context is enough, it finishes βœ… (see the routing sketch below)
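One way to express the β€œenough context?” decision in Haystack is a ConditionalRouter that checks for the 'NO_ANSWER' marker requested in the prompt; a sketch assuming Haystack 2.x, with route names that are my own rather than necessarily the talk's:

from haystack.components.routers import ConditionalRouter

# If the model said 'NO_ANSWER', send the query to web search; otherwise finish.
routes = [
    {
        "condition": "{{ 'NO_ANSWER' in replies[0] }}",
        "output": "{{ query }}",
        "output_name": "go_to_websearch",
        "output_type": str,
    },
    {
        "condition": "{{ 'NO_ANSWER' not in replies[0] }}",
        "output": "{{ replies[0] }}",
        "output_name": "answer",
        "output_type": str,
    },
]
router = ConditionalRouter(routes=routes)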

Slide 16

Slide 16 text

Web-Enhanced Self-Reflecting Agent

Slide 17

Slide 17 text

Web-Enhanced Self-Reflecting Agent

Slide 18

Slide 18 text

Web-Enhanced Self-Reflecting Agent

Slide 19

Slide 19 text

PROMPT

user
{% if web_documents %}
You were asked to answer the following query given the documents, but the context was not enough.
Here is the user question: {{ query }}
Context:
{% for document in documents %}
{{ document.content }}
{% endfor %}
{% for document in web_documents %}
URL: {{ document.meta.link }}
TEXT: {{ document.content }}
β€”
{% endfor %}
Answer the question based on the given context.
If you have enough context to answer this question, return your answer with the used links.
If you don't have enough context to answer, say 'NO_ANSWER'.
{% else %}
Answer the following query based on the documents retrieved from Haystack's documentation.
Documents:
{% for document in documents %}
{{ document.content }}
{% endfor %}
Query: {{ query }}
If you have enough context to answer this question, just return your answer.
If you don't have enough context to answer, say 'NO_ANSWER'.
{% endif %}
model

Slide 20

Slide 20 text

PROMPT (same template as the previous slide)

Slide 21

Slide 21 text

PIPELINE
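The slide shows the pipeline graph; below is a rough sketch of how it might be wired in Haystack 2.x. The component names, the condensed prompt template, the gemma2 model tag, and the loop from web search back into the prompt builder are my assumptions following the pattern described above, not the talk's exact code (SerperDevWebSearch needs a SERPERDEV_API_KEY, and the loop needs a Haystack release that supports cyclic pipelines):

from haystack import Pipeline
from haystack.components.builders import PromptBuilder
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.components.routers import ConditionalRouter
from haystack.components.websearch import SerperDevWebSearch
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack_integrations.components.generators.ollama import OllamaGenerator

# Condensed stand-in for the template on the PROMPT slide.
prompt_template = """
{% if web_documents %}
Question: {{ query }}
Web context:
{% for document in web_documents %}{{ document.content }}
{% endfor %}
{% else %}
Documents:
{% for document in documents %}{{ document.content }}
{% endfor %}
Query: {{ query }}
{% endif %}
Answer if you have enough context, otherwise say 'NO_ANSWER'.
"""

routes = [
    {"condition": "{{ 'NO_ANSWER' in replies[0] }}", "output": "{{ query }}",
     "output_name": "go_to_websearch", "output_type": str},
    {"condition": "{{ 'NO_ANSWER' not in replies[0] }}", "output": "{{ replies[0] }}",
     "output_name": "answer", "output_type": str},
]

pipeline = Pipeline()
pipeline.add_component("retriever", InMemoryBM25Retriever(document_store=InMemoryDocumentStore()))
pipeline.add_component("prompt_builder", PromptBuilder(template=prompt_template))
pipeline.add_component("llm", OllamaGenerator(model="gemma2"))
pipeline.add_component("router", ConditionalRouter(routes=routes))
pipeline.add_component("websearch", SerperDevWebSearch())

pipeline.connect("retriever.documents", "prompt_builder.documents")
pipeline.connect("prompt_builder.prompt", "llm.prompt")
pipeline.connect("llm.replies", "router.replies")
pipeline.connect("router.go_to_websearch", "websearch.query")
pipeline.connect("websearch.documents", "prompt_builder.web_documents")  # self-reflection loop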

Slide 22

Slide 22 text

LET’S RUN
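Running it could look like the sketch below, reusing the pipeline from the previous sketch; the input names and the example question are assumptions, not the slide's code:

query = "What is Haystack?"  # example question, not from the slides
result = pipeline.run({
    "retriever": {"query": query},
    "prompt_builder": {"query": query},
    "router": {"query": query},
})
# If the retrieved context was enough (or after the web-search loop), the final
# answer comes out of the router's "answer" edge.
print(result["router"]["answer"])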

Slide 23

Slide 23 text

LET’S RUN

Slide 24

Slide 24 text

Q&A
Web-Enhanced Self-Reflecting Agent
@BILGEYCL /IN/BILGE-YUCEL BILGEYUCEL