
From Transactions to Agents: PostgreSQL in Modern AI Applications

Modern AI applications are no longer just chatbots—they are agentic systems that reason, retrieve context, call tools, and maintain state across many user interactions. For application developers, the hardest part isn’t model selection—it’s building an architecture that scales without sacrificing latency, correctness, or developer velocity. In this session, we’ll walk through a PostgreSQL-centric architectural pattern for building scalable agentic AI applications using Databricks Lakebase (Postgres-compatible OLTP) alongside the Databricks Lakehouse.
session link: https://postgresconf.org/conferences/postgresconf_2026/program/proposals/from-transactions-to-agents-postgresql-in-modern-ai-applications


Amey Banarse

April 21, 2026

Transcript

  1. ©2025 Databricks Inc. — All rights reserved. About Me: Amey Banarse, Field Engineering Leader, Databricks. Previously: YugabyteDB ♦ Pivotal ♦ FINRA. San Francisco, CA. @ameybanarse | about.me/amey
  2. At Databricks, our mission is to democratize Data and AI: data democratized through AI, and AI democratized through your data.
  3. ©2025 Databricks Inc. — All rights reserved. Database technology was built for a different era: built for on-prem, very expensive, massive lock-in.
  4. ©2025 Databricks Inc. — All rights reserved. AI is turning everyone into application developers…
  5. ©2025 Databricks Inc. — All rights reserved. Agentic memory, online feature store, internal tools, real-time visualization, web apps… and every application needs a database.
  6. ©2025 Databricks Inc. — All rights reserved. AI is creating a huge disruption. [Chart: databases created by agents vs. humans on Neon, October 2024 through April 2025, with callouts at ~30% and ~80% created by agents.] AI agents create 4x more databases than humans on Neon.
  7. ©2025 Databricks Inc. — All rights reserved. Lakebase: transactional processing designed for the agentic era. Built on an open source Postgres foundation, with support for community extensions, separate compute and storage, very low latency, very high QPS, and production SLAs. Designed for agentic AI: launches in < 1 second, scales safely under agent-driven concurrency, pay only for what you use.
  8. ©2025 Databricks Inc. — All rights reserved. Lakebase accelerates the app dev lifecycle: transactional processing designed for the agentic era.
  9. ©2025 Databricks Inc. — All rights reserved. Data Intelligence Platform: AI/BI (agentic business intelligence), Lakeflow (ingest, ETL, streaming), DBSQL (data warehousing), Apps (secure data & AI apps), Agent Bricks (agentic AI & machine learning), Lakebase (serverless Postgres). All data under your control, with unified governance for data and AI.
  10. ©2025 Databricks Inc. — All rights reserved. What can you do with Lakebase? Analyze transactional data in the lakehouse (Lakebase → Lakehouse): order history for analytics, chatbot history as training data. Serve data from the lakehouse (Lakehouse → Lakebase): personalized recommendations, customer segmentation, feature store. Build your app on Lakebase (Application → Lakebase): order processing, interactive workflow sign-off, state for an agent.
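
A minimal sketch of the "build your app on Lakebase" pattern, assuming Lakebase is reachable over the standard Postgres wire protocol: an application or agent persists durable per-session state in an ordinary Postgres table. The connection settings, the agent_state table, and the session payload below are illustrative assumptions, not a fixed Lakebase API.

    import os
    import psycopg                        # psycopg 3; Lakebase is Postgres-compatible
    from psycopg.types.json import Jsonb

    # Hypothetical connection settings; actual Lakebase endpoints and credentials vary per workspace.
    conn = psycopg.connect(
        host=os.environ["LAKEBASE_HOST"],
        dbname=os.environ["LAKEBASE_DB"],
        user=os.environ["LAKEBASE_USER"],
        password=os.environ["LAKEBASE_TOKEN"],
        sslmode="require",
    )

    with conn, conn.cursor() as cur:
        # Durable state for an agent session or an interactive workflow sign-off.
        cur.execute("""
            CREATE TABLE IF NOT EXISTS agent_state (
                session_id TEXT PRIMARY KEY,
                state      JSONB NOT NULL,
                updated_at TIMESTAMPTZ DEFAULT now()
            )
        """)
        cur.execute(
            """
            INSERT INTO agent_state (session_id, state)
            VALUES (%s, %s)
            ON CONFLICT (session_id)
            DO UPDATE SET state = EXCLUDED.state, updated_at = now()
            """,
            ("session-123", Jsonb({"step": "awaiting_approval"})),
        )
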
  11. ©2024 Databricks Inc. — All rights reserved. Databricks AI DevKit: trusted access to AI coding assistants. Apps (secure data & AI apps), Lakebase (transactional database), Agent Bricks (agentic AI), Genie (business intelligence), Lakehouse (data warehousing).
  12. TrueClaimIQ: From Document to Decision in Seconds — Operational AI Built on Low-Latency Enterprise Intelligence for Revenue Cycle Management (RCM)
    Business Challenge: This healthtech firm's operational intelligence product monitors billing operations workflows at hospitals, capturing user event data to optimize the claims approval process. Its current system relies on rules-based OCR to extract identifiers (account/patient IDs) from workflow screenshots, but it struggles with identifier extraction accuracy. Without accurate ID extraction, the firm can't reliably link user workflow data to EHR records, breaking the chain needed to measure outcomes and surface actionable insights to its hospital customers.
    Databricks Solution: supercharge productivity and accelerate growth with the Foundation Model API for batch inference on screenshots, Lakebase for operationalizing extraction performance data, Databricks Apps for human-in-the-loop review of extraction performance, and Lakeflow for orchestrating the end-to-end screenshot processing pipeline.
    What is the Impact? 65-80% accuracy increase at a similar price point; 50-70% reduction in manual linkage work; 100% UC governance and customer satisfaction* (*we hope :)).
  13. Legacy: Claim Extraction Pipeline. Pain points: ~55% accuracy in claims extraction; ~20% of claims require laborious manual linkage. [Pipeline diagram: patient visit → provider submits claim → claim image intake streaming (2M/day) → rules-based extraction (RPA) → manual review of failing claim extractions → manual label linkage → claims processing insights, with an external human-in-the-loop review app.]
  14. AI-Native: Claim Extraction Pipeline (TrueClaimIQ). Value drivers: 90+% claims extraction accuracy at a lower cost point; 20-30% reduction in end-to-end processing time. [Pipeline diagram: patient visit → provider submits claim → claim image intake streaming (2M/day) → AI-based extraction (AI Functions) → automated selection of low-confidence reviews → human-in-the-loop review app → claims processing insights.]
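
The "automated selection of low-confidence reviews" step can be sketched as a simple confidence-threshold split in PySpark. The table names, the confidence column, and the 0.8 threshold below are assumptions for illustration, not TrueClaimIQ's actual values.

    from pyspark.sql import functions as F

    # `spark` is the active SparkSession, as provided in a Databricks notebook or job.
    extractions = spark.table("claims.extraction_results")        # hypothetical extraction output table

    needs_review  = extractions.filter(F.col("extraction_confidence") <  0.8)   # route to human review
    auto_approved = extractions.filter(F.col("extraction_confidence") >= 0.8)   # pass straight through

    needs_review.write.mode("append").saveAsTable("claims.review_queue")        # read by the review app
    auto_approved.write.mode("append").saveAsTable("claims.approved_extractions")
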
  15. TrueClaimIQ's 3-Layer Approach
    1. Lakehouse Intelligence: utilize AI Functions with Spark Declarative Pipelines to extract text and summarize usage from complicated images, instead of a brittle rules-based system or RPA that requires manual linkage for failover support and lacks the intelligence to understand usage.
    2. Operational Insights: native application state management and operationalization of data next to where the Lakehouse lives, instead of relying on external tools for apps, databases, and reverse ETL and spending huge development effort to ultimately reload your data back into Databricks.
    3. Learning Loop: in-app analytics over model accuracy, task- and reviewer-specific performance, drift, and compliance traceability, instead of building fine-tuned custom models and rolling external analytic tools; saving time, increasing accuracy, and driving greater levels of activity.
    The result: an operationally governed, continuously improving AI system.
  16. Real-World Scenario: AI-Native Claim Extraction Pipeline. Components: (1) Databricks Apps, (2) Lakebase, (3) Spark Declarative Pipelines, (4) batch inferencing (Foundation Model API), (5) Lakebase + Lakehouse synced tables, (6) Unity Catalog.
  17. ©2025 Databricks Inc. — All rights reserved. Layer 1: Lakehouse Intelligence + LLM. AI Functions: ai_parse_doc + ai_extract + ai_query. Foundation Model API: Anthropic Opus 4.6 for text extraction; GTE models for agent-memory embeddings. Lakeflow Synced Tables: real-time data flow from Lakehouse to Lakebase.
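
As a sketch of how Layer 1 might invoke these pieces, the AI Functions can be called from SQL (here via PySpark) for batch extraction over parsed claim documents. The table, column, and serving-endpoint names are assumptions for illustration.

    # ai_extract pulls named fields out of text; ai_query calls a model serving endpoint.
    df = spark.sql("""
        SELECT
          claim_id,
          ai_extract(ocr_text, array('account_id', 'patient_id')) AS ids,   -- structured ID extraction
          ai_query('claims-extraction-endpoint',                            -- assumed endpoint name
                   CONCAT('Summarize this claim screenshot text: ', ocr_text)) AS summary
        FROM claims.parsed_documents
    """)
    df.write.mode("overwrite").saveAsTable("claims.extraction_results")
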
  18. ©2025 Databricks Inc. — All rights reserved. Layer 2: Operational Insights Architecture. Databricks Apps: React + FastAPI. Agents on Apps: LangChain & LangGraph ReAct, with MLflow. Lakebase: app backend + agentic memory (pgvector).
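
A minimal sketch of agentic memory on Lakebase with pgvector, assuming a table like CREATE TABLE agent_memory (session_id TEXT, content TEXT, embedding vector(1024)) and query embeddings of matching dimension (1024 here, as with GTE-style models). The table name, column names, and dimension are illustrative assumptions.

    import psycopg

    def top_k_memories(conn: psycopg.Connection, session_id: str,
                       query_embedding: list[float], k: int = 5) -> list[str]:
        """Return the k stored memory snippets closest to the query embedding."""
        # pgvector accepts a bracketed text literal cast to ::vector.
        vec = "[" + ",".join(str(x) for x in query_embedding) + "]"
        with conn.cursor() as cur:
            cur.execute(
                """
                SELECT content
                FROM agent_memory
                WHERE session_id = %s
                ORDER BY embedding <=> %s::vector   -- cosine distance operator
                LIMIT %s
                """,
                (session_id, vec, k),
            )
            return [row[0] for row in cur.fetchall()]
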
  19. ©2025 Databricks Inc. — All rights reserved. Layer 3: Learning Loop Architecture. Lakehouse Synced Tables: real-time data flow from Lakebase to Lakehouse. Spark Declarative Pipelines: automated CDC from Lakebase & Lakehouse sources. DBSQL API + MCP Server: traditional and agentic tool calling of DBSQL for executing SQL.
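
The "DBSQL as a tool" idea in Layer 3 can be sketched as a plain Python function that an agent framework (LangChain/LangGraph) or an MCP server could expose. It uses the databricks-sql-connector; the environment variable names are placeholders.

    import os
    from databricks import sql   # pip install databricks-sql-connector

    def run_dbsql(query: str) -> list[tuple]:
        """Execute a SQL statement against a Databricks SQL warehouse and return the rows."""
        with sql.connect(
            server_hostname=os.environ["DATABRICKS_HOST"],    # e.g. <workspace>.cloud.databricks.com
            http_path=os.environ["DBSQL_HTTP_PATH"],          # SQL warehouse HTTP path
            access_token=os.environ["DATABRICKS_TOKEN"],
        ) as conn:
            with conn.cursor() as cursor:
                cursor.execute(query)
                return cursor.fetchall()
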
  20. Why Databricks? 1. Lakehouse Intelligence: end-to-end governance, multi-model AI inference, incremental declarative pipelines. 2. Operational Insights: low latency, scalability, portability. 3. Learning Loop: managed forward + reverse ETL, managed MLOps, responsive analytics. Databricks helps customers build, optimize, and deploy production-grade apps, focused on quality and cost-efficiency and grounded in your enterprise data.
  21. ©2025 Databricks Inc. — All rights reserved. Ensemble: 20% improvement in organizational efficiency.
    Challenge: Ensemble was operating across 15 fragmented data systems with inconsistent schemas, making it impossible to govern data effectively or power AI-driven revenue-cycle operations.
    Solution: Ensemble unified over 2PB of provider, payer, and clinical data on the Databricks Data Intelligence Platform. With Lakebase streamlining access, teams gained real-time, self-service insights that powered predictive modeling and AI workflows for proactive denial prevention and optimized revenue-cycle operations.
    Impact: up to 5% lift in net revenue yield for customers, year over year; 2PB+ of data enriched within the Databricks Platform. Read the case study >
  22. ©2025 Databricks Inc. — All rights reserved. Quantum: 1.5B+ records unified across six data sources.
    Challenge: Quantum evaluates complex energy investments using 1.5B+ records across multiple data sources. As deal volume and complexity grew, teams needed a consistent, governed way to share and analyze data across tools without slowing human-led diligence.
    Solution: Quantum adopted Lakebase to bring relational structure and governance to its data, making it easier for teams to access, share, and work from the same trusted datasets. Familiar, high-performance data access enabled faster collaboration and deeper analysis across deals.
    Impact: 100+ redundant tables eliminated; 11,000+ monthly queries across teams. Read the case study >