Blabla Conf: Quarkus meets AI: Build your own LLM-powered application

In today's dynamic AI landscape, the seamless integration of Large Language Models (LLMs) into applications is a key focus for developers. While many initiatives have emerged to facilitate the integration of LLMs, the world of Java has seen limited options.

Enter Langchain4j, a powerful library designed to seamlessly integrate Java applications with LLMs. The excitement amplifies when Langchain4j is brought into Quarkus, a framework designed for building cloud-native applications in Java. Quarkus is tuned for Kubernetes environments, boasting faster startup times and reduced memory usage compared to traditional Java applications. When Quarkus meets Langchain4j, building a Java LLM-powered application becomes an enjoyable experience.

In this talk, we’ll delve into building AI applications powered by LLMs, using Quarkus and Langchain4j. We’ll leverage existing features from the ecosystem to create effective strategies for data ingestion. We’ll demonstrate how to seamlessly bring in information from a broader set of resources, with the power of Apache Camel.
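
To make the ingestion idea concrete, the sketch below shows one way to wire an Apache Camel file route into a LangChain4j EmbeddingStoreIngestor. It is a minimal illustration, not the talk's demo code: the folder path, the chunk sizes, and the injected embedding model and store are assumptions for the example.

```java
// Minimal sketch (assumption, not the talk's demo): a Camel route that picks up
// text files and ingests them into a LangChain4j embedding store for later retrieval.
import dev.langchain4j.data.document.Document;
import dev.langchain4j.data.document.splitter.DocumentSplitters;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.store.embedding.EmbeddingStore;
import dev.langchain4j.store.embedding.EmbeddingStoreIngestor;
import org.apache.camel.builder.RouteBuilder;

public class DocumentIngestionRoute extends RouteBuilder {

    private final EmbeddingStoreIngestor ingestor;

    // The embedding model and store are passed in; a real application might use a
    // local model and a pgvector/Redis/Chroma store configured by Quarkus.
    public DocumentIngestionRoute(EmbeddingModel embeddingModel,
                                  EmbeddingStore<TextSegment> embeddingStore) {
        this.ingestor = EmbeddingStoreIngestor.builder()
                // Split documents into overlapping chunks before embedding; sizes are arbitrary here.
                .documentSplitter(DocumentSplitters.recursive(500, 50))
                .embeddingModel(embeddingModel)
                .embeddingStore(embeddingStore)
                .build();
    }

    @Override
    public void configure() {
        // Watch a local folder (illustrative path) and ingest every text file it contains.
        from("file:target/docs?noop=true")
                .routeId("ingest-docs")
                .process(exchange -> {
                    String text = exchange.getMessage().getBody(String.class);
                    ingestor.ingest(Document.from(text));
                })
                .log("Ingested ${header.CamelFileName}");
    }
}
```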

Zineb Bendhiba

October 10, 2024

Transcript

  1. Zineb Bendhiba - Open Source Engineer at Red Hat - Apache Camel PMC - International Speaker - 15+ years professional software development experience - Speaks English, French, Moroccan Darija, Arabic - University of Cadi Ayyad Alumni - https://zinebbendhiba.com - Twitter: @ZinebBendhiba
  2. Large Language Model - 101
     Model Pre-training (self-supervised)
     • Prepare/configure the billion parameters*
     • Collect/tokenize trillions of data*
     • Let the model self-train on that data
     • Model learns to predict the next word
       ◦ E.g. "once upon a" ⇒ "time" (99%)
       ◦ But it can also make up stuff!
     • Requires $M(MM) in GPU/processing power and days to weeks of training
     Model Fine-tuning (supervised)
     • Curate hundreds of high-quality chats (Q&As)
     • Let the model retrain on the new data set
     • Model learns to mimic the chat behaviour
       ◦ Understands instructions
       ◦ Using the pre-trained knowledge!
     • [Retrain to make it Helpful, Honest, Harmless]
       ◦ Reinforcement Learning from Human Feedback (RLHF)
     • Takes a day of processing for each cycle
     Turn Base Model into a Chat Assistant Model
     Evaluate, Deploy, Monitor, Re-train
  3. You can do *a lot* with a generic model …and enough context (docs, emails, other data)
  4. Components of LangChain4j
     • Basics: Prompt Templates, Language Models, Output Parsers, Memory
     • RAG: Document Loaders, Document Splitters, Embedding Models, Embedding Stores
     • Image Models (New!)
     • Chains, AI Services, Tools
     (See the sketch below for a few of these in use.)
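
As a rough illustration of how a few of these building blocks fit together in plain LangChain4j, here is a minimal sketch assuming the pre-1.0 (0.3x) API. The OpenAI model, the environment variable for the key, and the ten-message memory window are arbitrary choices for the example, not something prescribed by the talk.

```java
// Minimal sketch (assumed ~0.3x LangChain4j API): a language model, chat memory,
// and a declarative AI Service whose prompts come from annotations.
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.service.AiServices;
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;

public class Langchain4jBasics {

    // The AI Service is just an interface; LangChain4j generates the implementation.
    public interface Assistant {
        @SystemMessage("You are a concise assistant for Java developers.")
        String chat(@UserMessage String question);
    }

    public static void main(String[] args) {
        // Any ChatLanguageModel works; OpenAI is used here only as an example.
        ChatLanguageModel model = OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")
                .build();

        Assistant assistant = AiServices.builder(Assistant.class)
                .chatLanguageModel(model)
                .chatMemory(MessageWindowChatMemory.withMaxMessages(10)) // keeps the last 10 messages
                .build();

        System.out.println(assistant.chat("What is Quarkus in one sentence?"));
    }
}
```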
  5. Why LangChain4j with Quarkus?
     • Build time
       ◦ Build-time warnings
       ◦ Compile to native with GraalVM
     • Production Enhancements
       ◦ Unified Configuration
       ◦ Metrics, Tracing, Auditing
     • Programming Model
       ◦ Seamless integration with CDI
       ◦ Simpler declarative AI Services (@RegisterAiService); see the sketch below
     • Developer Joy
       ◦ 'quarkus dev' enables iterative testing and prompt tuning
       ◦ Dev UI allows viewing AI services, tools, and config, adding/searching embeddings, testing prompts, and generating images
     • Performance Enhancements
       ◦ Optimized Quarkus REST/JSON libs
       ◦ Reduced library footprint
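
As a concrete, hedged illustration of that declarative model, the sketch below shows what a Quarkus LangChain4j AI service can look like. The interface and resource names are invented for this example, and the actual chat model would be configured in application.properties (for instance an OpenAI or Ollama provider).

```java
// Minimal sketch of the declarative programming model with the quarkus-langchain4j
// extension: the annotated interface below becomes an injectable CDI bean.
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;
import jakarta.inject.Inject;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.QueryParam;

@RegisterAiService
interface CamelAssistant {

    @SystemMessage("You are an expert on Apache Camel and Quarkus. Answer briefly.")
    String answer(@UserMessage String question);
}

// A plain JAX-RS resource using the AI service like any other CDI bean.
@Path("/ask")
class AskResource {

    @Inject
    CamelAssistant assistant;

    @GET
    public String ask(@QueryParam("q") String question) {
        return assistant.answer(question);
    }
}
```

Because the service is an ordinary CDI bean, it can be injected anywhere in the application, and with 'quarkus dev' running the @SystemMessage prompt can be tweaked and re-tested through live reload, which is the developer-joy point above.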
  6. Links
     - Quarkus Langchain4j documentation
     - Quarkus Langchain4j samples
     - Apache Camel project
     - Quarkus project
     - Langchain4j project
     - Demo
     - My Apache Camel talk at Blabla Conf 2021
     - WIP: Camel Langchain4j component