Building Agentic AI Solutions With Custom Tools

Sam Ayo
September 25, 2024

Transcript

  1. Building Agentic AI Solutions With Custom Tools. Sam Ayo, Lead AI Engineer & Head of Engineering. https://www.linkedin.com/in/sam-ayo https://www.x.com/officialsamayo
  2. • Academic background: Economics, Math, Stats, ARTIBA • Areas of interest: core AI, NLP, audio AI, AI engineering, probabilistic models, experimentation & system inference design • Programming languages: Python, C++, C#, Golang, JavaScript, TypeScript • Recent work: real-time agentic systems, near real-time audio signal detection, semantic relation modelling and search • Industries covered: agnostic • Fun fact: built a LangChain equivalent in Golang; contributor to the Pinecone vector DB
  3. An Agentic System is a system that exhibits planning, decision-making, and environmental interaction in an intelligent-like manner.
  4. Components of AI Agents: Planning. An LLM can make plans for the actions it intends to take; this can include one or more future actions of the system. Key aspects: context awareness, thought creation.
  5. Components of AI Agents: Decision Making. For agent systems to hold up, having certain criteria or algorithms associated with making decisions is critical, because it allows AI agents to take on broad tasks where the decision space is very wide. Key aspects: reasoning, articulation of the steps to solving complex problems.
  6. Components of AI Agents: Environmental Interaction. • High effort to design • Depends on a resilient multi-architecture AI system • Enables non-LLM capabilities. Key aspects: tool use, tool recognition.
  7. Environment Interaction (RAG). Chunking strategy: your chunking strategy depends on what your data looks like and what you need from it. What you must consider: • Chunk size (fixed size, paragraph, semantic) • Chunk overlap • Chunk splitters. Embedding strategy: your embedding strategy depends on your accuracy, cost, and use-case needs. It involves: • Embedding chunks directly • Embedding sub- and super-chunks • Incorporating chunking metadata. What you must consider: • Accuracy • Appropriateness for the task • Speed of computation • Length of the output vector • Size of the input
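The fixed-size-with-overlap option above can be sketched in a few lines. This helper is illustrative only (the function name and defaults are assumptions, not from the talk):

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into fixed-size chunks, each overlapping the previous one.

    Overlap keeps context that straddles a chunk boundary retrievable
    from either side. Sizes are in characters for simplicity; a real
    pipeline would typically count tokens instead.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap  # how far the window advances each time
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break  # the last window already covers the end of the text
    return chunks
```

Paragraph or semantic chunking would replace the fixed window with splits on paragraph boundaries or embedding-similarity drops, at the cost of variable chunk sizes.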
  8. Building Custom Tools. Building custom tools depends heavily on your ability to think innovatively about tool names and descriptions. As a rule, tool descriptions are divided into: • How to use the tool (instructive) • What the tool does (descriptive) • Inputs to interact with the tool
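Those three parts of a tool description can be sketched as a plain-Python structure. Everything here (class, field, and tool names) is a hypothetical illustration of the split, not an API from the talk:

```python
from dataclasses import dataclass, field

@dataclass
class ToolSpec:
    """One custom tool, with its description split into the three parts."""
    name: str
    descriptive: str                 # what the tool does
    instructive: str                 # how to use the tool
    input_schema: dict = field(default_factory=dict)  # inputs it accepts

    def description(self) -> str:
        # The single description string an LLM would see when choosing tools.
        return f"{self.descriptive} {self.instructive}"

weather_tool = ToolSpec(
    name="get_weather",
    descriptive="Returns the current weather for a city.",
    instructive="Call with a city name, e.g. get_weather('Lagos').",
    input_schema={"city": "string"},
)
```

Keeping the instructive and descriptive parts separate makes it easier to iterate on each independently when the agent misuses or overlooks a tool.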
  9. Building Custom Tools with LangChain. • High effort to design • Depends on a resilient multi-architecture AI system • Enables non-LLM capabilities
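LangChain popularized registering custom tools with a decorator that attaches a name and description to a function. The sketch below hand-rolls that pattern without the library so it stays self-contained; the decorator and registry here are illustrative, not the real LangChain API:

```python
# A tiny registry mimicking the decorator-based tool pattern.
TOOLS: dict[str, dict] = {}

def tool(description: str):
    """Register a function as a tool, keyed by its name (illustrative)."""
    def register(fn):
        TOOLS[fn.__name__] = {"fn": fn, "description": description}
        return fn
    return register

@tool("Descriptive: adds two integers. Instructive: call with two integers.")
def add_numbers(a: int, b: int) -> int:
    return a + b

# An agent would match a task to a tool by its description, then invoke it:
result = TOOLS["add_numbers"]["fn"](2, 3)  # → 5
```

In real LangChain code the registered tools would be passed to an agent executor, which selects among them based on those same name/description strings.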
  10. Identifying GPUs on AWS. • EC2 G4dn (Nvidia T4, typically 15 GiB VRAM): the lowest-cost GPU-based instances in the cloud for machine-learning inference and small-scale training. Great for prototyping and releasing micro models to production; cost effective. • EC2 G6 (Nvidia L4, typically 22 GiB VRAM): offers 2x better performance for deep-learning inference and graphics workloads compared to G4dn instances. Ideal for production workloads: deployment of ML models for natural language processing, language translation, video and image analysis, speech recognition. • EC2 G5 (Nvidia A10G, typically 24 GiB VRAM): offers 3x better performance for deep-learning inference and graphics workloads compared to G4dn instances. Ideal for complex production workloads.