How I used Siri, PaLM, LangChain, and Firebase to create an Exobrain

Peter Friese
September 06, 2023

In our fast-paced world, there is just too much information, and it often seems impossible to keep up with everything that’s going on.

If you have ever felt that you couldn’t possibly remember everything you saw, read, or even didn’t read, come to this talk and I will show you how I built an app that lets me do just that.

I will show you how I:
- used SwiftUI to build a beautiful app that works across Apple’s platforms
- used Cloud Firestore to store gigabytes of data, keeping it in sync across all of my devices
- used the PaLM API to summarise articles and ask my app questions about them
- used LangChain to connect PaLM to my personal data store
- used Siri to provide a natural language interface that lets me query my knowledge base hands-free

Transcript

  1. Building an Exobrain

  2. Peter Friese
    Developer Relations Engineer, Firebase
    @peterfriese

  3. Information Overload

  4. Isn’t there an app for that?

  5. Not invented here

  6. Codename Sofia
    ✨ Add links via iOS Share Extension
    ✨ Readability
    ✨ Extract OpenGraph Metadata
    ✨ Summarise (via GPT-3)
    ✨ Q&A (via PaLM, LangChain, pgvector)
    PaLM: Pathways Language Model

  7. Building an exobrain
    Architecture
    Storing cleaned-up articles for offline reading
    Summarising articles
    Q&A with your knowledge base

  8. Architecture

  9. Architecture: client-side / backend

  10. Client-side implementation
    Use a familiar language (Swift)
    Use APIs you know
    No need to operate servers
    Can do everything yourself

  11. Client-side implementation

  12. Serverless Architecture (diagram): Client → ? (“A miracle happens here”) → Firebase Backend (Authentication, Cloud Firestore, Cloud Storage)

  13. Serverless Architecture (diagram): Client → Custom Backend (Node.js) → Firebase Backend (Authentication, Cloud Firestore, Cloud Storage)

  14. Serverless Architecture (diagram): Client → ? → Firebase Backend (Authentication, Cloud Firestore, Cloud Storage)

  15. Serverless Architecture (diagram): Client → Cloud Functions → Firebase Backend (Authentication, Cloud Firestore, Cloud Storage)


  18. Swift?
    NEW: Python
    JavaScript/TypeScript
    Trusted environment
    Automatic scaling
    Serverless


  20. How to call Cloud Functions?

  21. How to call Cloud Functions?
    HTTP: listen for HTTP requests
    HTTPS Callable: call a function from your app, with extra context

  22. How to call Cloud Functions?
    Scheduled: using crontab or AppEngine cron.yaml syntax
    In a queue: for resource-intensive, long-running tasks
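
    The scheduled flavour, as a minimal Python sketch (the schedule string and function body are assumptions, not the deck's code):

    from firebase_functions import scheduler_fn

    @scheduler_fn.on_schedule(schedule="every day 03:00")
    def nightly_cleanup(event: scheduler_fn.ScheduledEvent) -> None:
        # Runs on a cron-style schedule, just like crontab / cron.yaml.
        print(f"Scheduled run fired at {event.schedule_time}")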

  23. How to call Cloud Functions?
    Triggers:
    Data written / Data updated / Data deleted
    New user created / User signing in
    Image uploaded / File deleted
    Crashlytics alert / Analytics Conversions
    Test run completed
    New configuration / Configuration rollback

  24. Building an exobrain
    Architecture ✓
    Storing cleaned-up articles for offline reading
    Summarising articles
    Q&A with your knowledge base

  25. Storing cleaned-up HTML for offline reading

  27. Storing cleaned-up HTML for offline reading
    (diagram) Client → Cloud Firestore; doc(...).onCreate() triggers Cloud Functions (server), which fetch the Web Site


  29. One-Time Fetches
    Offline Mode
    Effortless Syncing
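
    The app's client is Swift, but keeping all sketches in one language, here is the "effortless syncing" idea with the Firestore Python client (only the "articles" collection name comes from the deck):

    from google.cloud import firestore

    db = firestore.Client()

    def on_articles(col_snapshot, changes, read_time):
        # Fires once with the current contents, then again whenever any
        # device changes the data.
        for change in changes:
            print(f"{change.type.name}: {change.document.id}")

    watch = db.collection("articles").on_snapshot(on_articles)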

  30. Collection
    Sub-Collection
    Document

  31. (diagram) Top-level collections: the “articles” collection, containing a single article document
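
    How that hierarchy maps to code, sketched with the Python client (the document ID and sub-collection name are made up for illustration; the app itself uses the Swift SDK):

    from google.cloud import firestore

    db = firestore.Client()
    articles = db.collection("articles")            # top-level collection
    article = articles.document("some-article-id")  # a single article document
    notes = article.collection("notes")             # a sub-collection (hypothetical)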

  32. public struct Article: Identifiable, Codable {
        @DocumentID public var id: String?

        public var title: String
        public var author: String?
        public var readingTime: Int?
        public var url: String
        public var imageUrl: String?
        public var siteName: String
        public var dateAdded: Date
        public var excerpt: String?
        public var notes: String?
        public var isRead: Bool? = false
        public var isStarred: Bool? = false
        public var userId: String?
        public var readableHTML: String? = nil
    }

  33. // “articles” is the collection path; addDocument(from:) uses
    // Firestore’s Codable support to encode the article document
    // (URL, excerpt, title, etc.).
    let docRef = try db.collection("articles").addDocument(from: article)

  38. Cloud Firestore Triggers
    Execute a Cloud Function for the following events:
    onCreate(change, context)
    onUpdate(change, context)
    onDelete(change, context)
    onWrite(change, context)

    const uid = context.auth.uid;
    const name = context.auth.token.name || null;
    const picture = context.auth.token.picture || null;
    const email = context.auth.token.email || null;

  39. Storing cleaned-up HTML for offline reading

    export const storeReadableLink =
      functions.firestore
        .document("artifacts/{documentId}")
        .onCreate(async (documentSnapshot) => {
          // The Firestore document data carries the article's URL.
          const url = documentSnapshot.data().url;
          // @postlight/mercury-parser extracts the readable content.
          const metadata = await parser.parse(url);
          const artifactDocument = documentSnapshot.data() as ArtifactDocument;
          if (!artifactDocument.readableHTML && metadata.content) {
            artifactDocument.readableHTML = metadata.content;
          }
          // update the document in Firestore
          return documentSnapshot.ref.update(artifactDocument);
        });

  48. Building an exobrain
    Architecture ✓
    Storing cleaned-up articles for offline reading ✓
    Summarising articles
    Q&A with your knowledge base

  49. Summarising articles

  51. Summarising articles
    (diagram) Client → https.onCall() → Cloud Functions (server) → GPT-3, with Cloud Firestore in the Firebase Backend

  52. Callable Functions
    onCall(data, context): call a Cloud Function directly from your app
    (vs. onRequest(request, response))

    const uid = context.auth.uid;
    const name = context.auth.token.name || null;
    const picture = context.auth.token.picture || null;
    const email = context.auth.token.email || null;
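
    The same callable idea in the Python SDK used later in the deck, as a minimal sketch (function name and return value are illustrative): the verified Firebase Auth user arrives as the request's auth context.

    from firebase_functions import https_fn

    @https_fn.on_call()
    def whoami(req: https_fn.CallableRequest):
        # req.auth carries the verified Firebase Auth context, or None.
        if req.auth is None:
            raise https_fn.HttpsError(
                code=https_fn.FunctionsErrorCode.UNAUTHENTICATED,
                message="Sign in first.",
            )
        return {"uid": req.auth.uid}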

  53. Summarising articles

    // A callable function; "summarise" is the function's name.
    export const summarise =
      functions.https
        .onCall(async (data) => {
          const url = data;
          // Call the remote API with the prompt for the LLM.
          const completion = await openai.createCompletion({
            model: "text-davinci-001",
            prompt: `Summarize this article: ${url}`,
          });
          // The result of the remote API.
          return completion.data.choices[0].text;
        });
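
    This version calls GPT-3 from TypeScript. Given the talk's PaLM focus, here is a sketch of what a PaLM-based equivalent could look like in the Python SDK with LangChain (this is not the deck's actual code):

    from firebase_functions import https_fn
    from langchain.llms import GooglePalm

    @https_fn.on_call()
    def summarise(req: https_fn.CallableRequest) -> str:
        url = req.data  # the article URL sent by the client
        llm = GooglePalm()
        return llm(f"Summarize this article: {url}")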

  58. How to call a Cloud Function

    // Construct the callable endpoint. "summariseURL" is the name of
    // the callable function; Callable carries the input and output
    // parameter types and enables callAsFunction.
    let summarise: Callable = functions.httpsCallable("summariseURL")

    // Perform the call - like a local function!
    do {
      return try await summarise(urlString)
    }
    catch {
      return ""
    }

  62. Building an exobrain
    Architecture ✓
    Storing cleaned-up articles for offline reading ✓
    Summarising articles ✓
    Q&A with your knowledge base

  63. Q&A with your knowledge base

  65. Q&A: Store knowledge
    (diagram) Client → Cloud Functions → Cloud Firestore (Firebase Backend)

  66. Q&A: Store knowledge
    (diagram) Client → Cloud Firestore; doc(...).onCreate() triggers Cloud Functions, which call the PaLM API and Cloud SQL (Google Cloud)

  67. What are vector embeddings?!

  68. What are vector embeddings?!
    Task: Find all words that are food in the following sentence

  69. Task: Find all words that are food in the following sentence:
    “I went down to Aberystwyth on foot to buy some welsh cakes and a
    few berries. When I finished doing my groceries, I had a latte at
    Coffee #1, where I met a few other speakers.”

  72. Vector embedding: a numerical representation of a word, sentence, or any other unit of text.

  73. Vector embedding for food:
    [-0.018035058, 0.013980114, -0.01309541, 0.024956783, 0.02708295, -0.074924484, 0.03496225, 0.0125780115, ...]

  74. Vector embedding for food:
    [-0.018035058, 0.013980114, -0.01309541, 0.024956783, 0.02708295, -0.074924484, 0.03496225, 0.0125780115, ...]
    Vector embedding for foot:
    [-0.016025933, 0.008207399, -0.03572462, 0.020942606, -0.0003162824, -0.041694388, 0.050102886, 0.007380137, ...]

  75. Coordinates for London: [51.50721, -0.12758]

  76. Paris    [48.85661, 2.35222]
    London   [51.50721, -0.12758]
    New York [40.71277, -74.00597]
    Boston   [42.36008, -71.05888]
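
    The analogy in code, as a quick illustrative sketch: treating the slide's coordinates as tiny 2-D "embeddings", nearby vectors mean similar things.

    import math

    cities = {
        "Paris": (48.85661, 2.35222),
        "London": (51.50721, -0.12758),
        "New York": (40.71277, -74.00597),
        "Boston": (42.36008, -71.05888),
    }

    def distance(a: str, b: str) -> float:
        return math.dist(cities[a], cities[b])

    print(distance("London", "Paris"))     # small: similar
    print(distance("New York", "Boston"))  # small: similar
    print(distance("London", "New York"))  # large: dissimilar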

  78. Cosine vector similarity
    Source: https://www.learndatasci.com/glossary/cosine-similarity/
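
    Cosine similarity compares the angle between two embedding vectors rather than their length. A minimal sketch (the helper is mine, not the deck's; GooglePalmEmbeddings is the LangChain wrapper the deck uses later):

    import math
    from langchain.embeddings import GooglePalmEmbeddings

    def cosine_similarity(a: list[float], b: list[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(x * x for x in b))
        return dot / (norm_a * norm_b)

    embeddings = GooglePalmEmbeddings()
    food, foot = embeddings.embed_documents(["food", "foot"])
    # "food" and "foot" are close in spelling but not in meaning, so
    # expect a lower score than for two food-related words.
    print(cosine_similarity(food, foot))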

  79. Q&A: Store knowledge

    # Triggered when a Firestore document at this path is updated;
    # {artifactId} captures the document ID.
    @on_document_updated(document="artifacts/{artifactId}",
                         memory=options.MemoryOption.GB_1)
    # "computeembeddings" is the public name of the function; the event
    # parameter contains a snapshot of the document change.
    def computeembeddings(event: Event[Change[DocumentSnapshot]]) -> None:
        # The NEW state of the document.
        new_value = event.data.after.to_dict()

        # Get the URL of the article and the document ID.
        url = new_value["url"]
        artifactId = event.params["artifactId"]
        # Compute and store the embeddings for the article.
        store_embeddings(url, artifactId)

  88. Q&A: Store knowledge

    # LLMs can only handle a given number of tokens, so the article is
    # split into chunks before embedding.
    def store_embeddings(url, artifactId):
        global lazy_vectorstore
        documents = chunk_artifact(url)
        for document in documents:
            document.metadata["artifactId"] = artifactId
        if not lazy_vectorstore:
            # Connect to the vector store (Postgres + pgvector on Cloud SQL).
            lazy_vectorstore = make_vectorstore()
        # Use LangChain to store the embeddings in the vector store.
        lazy_vectorstore.add_documents(documents=documents)
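
    chunk_artifact is referenced but not shown in the deck; one plausible sketch with LangChain (the loader choice and chunk sizes are assumptions):

    from langchain.document_loaders import WebBaseLoader
    from langchain.text_splitter import RecursiveCharacterTextSplitter

    def chunk_artifact(url: str):
        # Load the article, then split it into overlapping chunks so each
        # one fits within the embedding model's token limit.
        documents = WebBaseLoader(url).load()
        splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
        return splitter.split_documents(documents)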

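
    make_vectorstore is likewise referenced but not shown; a possible sketch using LangChain's PGVector store (the collection name is an assumption; connection_string() is the deck's own helper):

    from langchain.embeddings import GooglePalmEmbeddings
    from langchain.vectorstores.pgvector import PGVector

    def make_vectorstore() -> PGVector:
        # Postgres + pgvector on Cloud SQL, with PaLM embeddings.
        return PGVector(
            connection_string=connection_string(),
            embedding_function=GooglePalmEmbeddings(),
            collection_name="artifacts",  # assumed collection name
        )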

  91. Q&A: Retrieving knowledge
    (sequence diagram) Client → Cloud Functions: semantic_qa(query) via https.onCall()
    Cloud Functions → PaLM API: compute_embeddings(query) → embeddings_for_query
    Cloud Functions → Cloud SQL: retrieve_similar_documents(embedding) → documents
    Cloud Functions → PaLM API: generate_answer(documents, prompt) → answer
    Cloud Functions → Client: answer

  92. Q&A: Retrieve knowledge

    # A callable function, with extra memory for the embeddings work.
    @https_fn.on_call(memory=options.MemoryOption.GB_1)
    def semanticqa(req: https_fn.CallableRequest) -> str:
        global lazy_vectorstore
        # Get the query.
        query = req.data
        # The system prompt.
        prompt_template = """
        You are Sofia, the user's personal assistant that has access to all
        the knowledge the user has stored in this app.

        Given the following sections from the user's knowledge base, answer
        the question using only that information, outputted in Markdown format.

        If you are unsure and the answer is not explicitly written in the context
        sections, say "I am sorry, but I don't have access to this information."

        If you *can* answer the question, give a concise answer. Do NOT waffle around.

        Context sections:
        {context}

        Question:
        {question}
        """

        # Inject the variables into the prompt.
        PROMPT = PromptTemplate(
            template=prompt_template, input_variables=["question", "context"]
        )
        chain_type_kwargs = {"prompt": PROMPT}
        connection = connection_string()
        embeddings = GooglePalmEmbeddings()

        # Get the vector store; its retriever fetches similar documents.
        if not lazy_vectorstore:
            lazy_vectorstore = make_vectorstore()
        retriever = lazy_vectorstore.as_retriever()
        # Use LangChain for Q&A; the chain can be configured with
        # different LLMs and data sources.
        qa = RetrievalQA.from_chain_type(llm=GooglePalm(),
                                         chain_type="stuff",
                                         retriever=retriever,
                                         chain_type_kwargs=chain_type_kwargs)
        # Run the chain and return the answer.
        answer = qa.run(query)
        return answer

  102. Building an exobrain
    Architecture ✓
    Storing cleaned-up articles for offline reading ✓
    Summarising articles ✓
    Q&A with your knowledge base ✓

  103. Building an exobrain
    Architecture ✓
    Storing cleaned-up articles for offline reading ✓
    Summarising articles ✓
    Q&A with your knowledge base ✓

    One more thing …

  104. “mhh hmm?”

  105. Siri: App Intents

    // An app intent, discovered at compile time.
    struct AskQuestionIntent: AppIntent {
      // Inject our QA service.
      @Injected(\.semanticQAService) private var semanticQAService

      static let title: LocalizedStringResource = "Ask a question"
      static let description: LocalizedStringResource = "Asks Sofia a question"

      // Input parameter for the intent.
      @Parameter(title: "Question",
                 description: "A question to answer from your knowledge base")
      var question: String?

      // Execute the intent.
      @MainActor
      func perform() async throws -> some ProvidesDialog & ShowsSnippetView {
        // No value provided? Just ask!
        guard let providedPhrase = question else {
          throw $question.needsValueError("Sure - what would you like to know?")
        }
        // Call the backend service.
        let answer = await semanticQAService.answerQuestion(question: providedPhrase)
        // Return the answer in a dialog; the snippet view renders it with
        // gonzalezreal/swift-markdown-ui.
        return .result(dialog: IntentDialog(stringLiteral: "Here is your answer")) {
          Markdown(answer)
        }
      }
    }

  113. How I used Siri, PaLM, LangChain, and Firebase to create an Exobrain

  114. Thanks
    peterfriese.dev
    @peterfriese
    youtube.com/c/PeterFriese/
    not-only-swift.peterfriese.dev/
    @[email protected]
    youtube.com/c/Firebase

  115. Q&A