

Peter Friese
September 06, 2023

How I used Siri, PaLM, LangChain, and Firebase to create an Exobrain

In our fast-paced world, there is just too much information, and it often seems impossible to keep up with everything that’s going on.

If you’ve ever felt that you couldn’t possibly remember everything you saw, read, or even didn’t read, come to this talk and I will show you how I built an app that allows me to do just that.

I will show you how I
- used SwiftUI to build a beautiful app that works across Apple’s platforms
- used Cloud Firestore to store gigabytes of data, keeping it in sync across all of my devices
- used the PaLM API to summarise articles and to ask my app questions about them
- used LangChain to connect PaLM to my personal data store
- used Siri to provide a natural language interface that allows me to query my knowledge base hands-free


Transcript

  1. Codename Sofia
     ✨ Add links via iOS Share Extension
     ✨ Readability
     ✨ Extract OpenGraph Metadata
     ✨ Summarise (via GPT-3)
     ✨ Q&A (via PaLM, LangChain, pgvector)
     PaLM: Pathways Language Model
  2. 🧠 Building an exobrain: Architecture · Storing cleaned-up articles for offline reading · Summarising articles · Q&A with your knowledge base
  3. Client-side implementation: Use a familiar language (Swift) · Use APIs you know · No need to operate servers · Can do everything yourself
  4. How to call Cloud Functions? HTTPS: listen for HTTP requests. Callable: call a function from your app, with extra context.
  5. How to call Cloud Functions? Scheduled: using crontab or App Engine cron.yaml syntax. In a queue: for resource-intensive, long-running tasks.
  6. How to call Cloud Functions? Triggers: data written · new user created · user signing in · image uploaded · Crashlytics alert · Analytics conversions · file deleted · data deleted · test run completed · data updated · new configuration · configuration rollback
  7. 🧠 Building an exobrain: Architecture ✓ · Storing cleaned-up articles for offline reading · Summarising articles · Q&A with your knowledge base
  8. Storing cleaned-up HTML for offline reading. Architecture diagram: the client writes a document to Cloud Firestore; in the Firebase backend, a Cloud Function (doc(...).onCreate()) fetches the article from the web site’s server and stores the cleaned-up HTML back into the document.
  9. public struct Article: Identifiable, Codable {
       @DocumentID public var id: String?
       public var title: String
       public var author: String?
       public var readingTime: Int?
       public var url: String
       public var imageUrl: String?
       public var siteName: String
       public var dateAdded: Date
       public var excerpt: String?
       public var notes: String?
       public var isRead: Bool? = false
       public var isStarred: Bool? = false
       public var userId: String?
       public var readableHTML: String? = nil
     }
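     For context, a minimal sketch of how such a document might be loaded on the client, assuming the Firebase Firestore Swift SDK and the "artifacts" collection name used by the Cloud Functions later in the deck (this helper function is illustrative, not code from the talk):

      import FirebaseFirestore

      // Sketch: fetch the current user's unread articles and decode them via Codable.
      // @DocumentID lets Firestore fill in the document ID during decoding.
      func fetchUnreadArticles(for userId: String) async throws -> [Article] {
        let snapshot = try await Firestore.firestore()
          .collection("artifacts")
          .whereField("userId", isEqualTo: userId)
          .whereField("isRead", isEqualTo: false)
          .getDocuments()
        return try snapshot.documents.map { try $0.data(as: Article.self) }
      }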
  10. Cloud Firestore Triggers. Execute a Cloud Function for the following events: onCreate(change, context) · onUpdate(change, context) · onDelete(change, context) · onWrite(change, context)

      const uid = context.auth.uid;
      const name = context.auth.token.name || null;
      const picture = context.auth.token.picture || null;
      const email = context.auth.token.email || null;
  11.–19. Storing cleaned-up HTML for offline reading (slides 12–19 step through the annotated lines of this one function):

      export const storeReadableLink = functions.firestore
        .document("artifacts/{documentId}")
        .onCreate(async (documentSnapshot) => {
          // Firestore document data
          const url = documentSnapshot.data().url;
          // extract the readable content with @postlight/mercury-parser
          const metadata = await parser.parse(url);
          const artifactDocument = documentSnapshot.data() as ArtifactDocument;
          if (!artifactDocument.readableHTML && metadata.content) {
            artifactDocument.readableHTML = metadata.content;
          }
          // update the document in Firestore
          return documentSnapshot.ref.update(artifactDocument);
        });
  20. 🧠 Building an exobrain: Architecture ✓ · Storing cleaned-up articles for offline reading ✓ · Summarising articles · Q&A with your knowledge base
  21. Callable Functions. onCall(data, context): call a Cloud Function directly from your app (vs. onRequest(request, response)); the context carries auth information:

      const uid = context.auth.uid;
      const name = context.auth.token.name || null;
      const picture = context.auth.token.picture || null;
      const email = context.auth.token.email || null;
  22.–26. Summarising articles with a callable function (slides 23–26 highlight the annotated lines):

      // "summarise" is the function's name
      export const summarise = functions.https
        .onCall(async (data) => {
          const url = data;
          // call the remote API
          const completion = await openai.createCompletion({
            model: "text-davinci-001",
            // the prompt for the LLM
            prompt: `Summarize this article: ${url}`,
          });
          // result of the remote API
          return completion.data.choices[0].text;
        });
  27.–30. How to call a Cloud Function from Swift (slides 28–30 highlight the annotated lines):

      // Construct the callable endpoint.
      // Callable<String, String> declares the input and output parameters
      // and enables callAsFunction; "summariseURL" is the name of the callable function.
      let summarise: Callable<String, String>
          = functions.httpsCallable("summariseURL")

      // Perform the call - like a local function!
      do {
        return try await summarise(urlString)
      } catch {
        return ""
      }
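     As a usage note, the callable can be wrapped in a small client-side service so views don’t talk to Firebase directly; a minimal sketch, where the service type and method names are assumptions rather than code from the talk:

      import FirebaseFunctions

      // Hypothetical wrapper around the summariseURL callable (names are illustrative).
      final class SummaryService {
        // Callable<String, String>: the request is the article URL, the response is the summary.
        private lazy var summarise: Callable<String, String> =
          Functions.functions().httpsCallable("summariseURL")

        func summary(for urlString: String) async -> String {
          do {
            // callAsFunction lets us invoke the endpoint like a local function
            return try await summarise(urlString)
          } catch {
            // a real app would surface this error to the user
            return ""
          }
        }
      }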
  31. 🧠 Building an exobrain: Architecture ✓ · Storing cleaned-up articles for offline reading ✓ · Summarising articles ✓ · Q&A with your knowledge base
  32. Q&A: Store knowledge. Architecture diagram: the client writes a document to Cloud Firestore; a Cloud Function (doc(...).onCreate()) in the Firebase backend calls the PaLM API to compute embeddings and stores them in Cloud SQL on Google Cloud.
  33.–35. Task: Find all words that are food in the following sentence: “I went down to Aberystwyth on foot to buy some Welsh cakes and a few berries. When I finished doing my groceries, I had a latte at Coffee #1, where I met a few other speakers.”
  36. Vector embedding for food: [-0.018035058, 0.013980114, -0.01309541, 0.024956783, 0.02708295, -0.074924484, 0.03496225, 0.0125780115, ...]
      Vector embedding for foot: [-0.016025933, 0.008207399, -0.03572462, 0.020942606, -0.0003162824, -0.041694388, 0.050102886, 0.007380137, ...]
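     To make “similar meaning” concrete: embeddings are usually compared with cosine similarity, where values close to 1 mean the vectors point in almost the same direction. A small, self-contained Swift sketch (the vectors are just the truncated values from the slide, so the printed number is only illustrative):

      import Foundation

      // Cosine similarity of two equal-length vectors: dot(a, b) / (|a| * |b|)
      func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
        let dot = zip(a, b).map { $0 * $1 }.reduce(0, +)
        let normA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
        let normB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
        return dot / (normA * normB)
      }

      // Truncated embeddings from the slide; real PaLM embeddings have many more dimensions.
      let food = [-0.018035058, 0.013980114, -0.01309541, 0.024956783]
      let foot = [-0.016025933, 0.008207399, -0.03572462, 0.020942606]
      print(cosineSimilarity(food, foot))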
  37.–43. Q&A: Store knowledge (slides 38–43 highlight the annotated lines):

      # triggered when a Firestore document at this path is updated;
      # the Python function name is the public name of the deployed function
      @on_document_updated(document="artifacts/{artifactId}",
                           memory=options.MemoryOption.GB_1)
      def computeembeddings(event: Event[Change[DocumentSnapshot]]) -> None:
          # the event parameter contains a document snapshot; "after" is the NEW state
          new_value = event.data.after.to_dict()
          # get the URL of the article
          url = new_value["url"]
          artifactId = event.params["artifactId"]
          # compute and store the embeddings for the article
          store_embeddings(url, artifactId)
  44.–46. Q&A: Store knowledge (slides 45–46 highlight the annotated lines):

      def store_embeddings(url, artifactId):
          global lazy_vectorstore
          # LLMs can only handle a given number of tokens, so split the article into chunks
          documents = chunk_artifact(url)
          for document in documents:
              document.metadata["artifactId"] = artifactId
          if not lazy_vectorstore:
              # connect to the vector store (Postgres + pgvector on Cloud SQL)
              lazy_vectorstore = make_vectorstore()
          # use LangChain to store the embeddings in the vector store
          lazy_vectorstore.add_documents(documents=documents)
  47. Q&A: Retrieving knowledge. Architecture diagram: the client calls semantic_qa(query) via https.onCall(); the Cloud Function calls compute_embeddings(query) on the PaLM API, fetches documents with retrieve_similar_documents(embedding) from Cloud SQL, runs generate_answer(documents, prompt), and returns the answer to the client.
  48.–57. Q&A: Retrieve knowledge (slides 49–57 highlight the annotated lines):

      # a callable function; moar memory!
      @https_fn.on_call(memory=options.MemoryOption.GB_1)
      def semanticqa(req: https_fn.CallableRequest) -> str:
          global lazy_vectorstore
          # get the query
          query = req.data

          # the system prompt
          prompt_template = """
          You are Sofia, the user's personal assistant that has access to all the
          knowledge the user has stored in this app.

          Given the following sections from the user's knowledge base, answer
          the question using only that information, outputted in Markdown format.
          If you are unsure and the answer is not explicitly written in the context
          sections, say "I am sorry, but I don't have access to this information."
          If you *can* answer the question, give a concise answer. Do NOT waffle around.

          Context sections:
          {context}

          Question:
          {question}
          """

          # inject variables into the prompt
          PROMPT = PromptTemplate(
              template=prompt_template,
              input_variables=["question", "context"]
          )
          chain_type_kwargs = {"prompt": PROMPT}
          connection = connection_string()
          embeddings = GooglePalmEmbeddings()

          # get the vector store
          if not lazy_vectorstore:
              lazy_vectorstore = make_vectorstore()
          # for retrieving documents
          retriever = lazy_vectorstore.as_retriever()

          # use LangChain for Q&A; can be configured with different LLMs and data sources
          qa = RetrievalQA.from_chain_type(llm=GooglePalm(),
                                           chain_type="stuff",
                                           retriever=retriever,
                                           chain_type_kwargs=chain_type_kwargs)

          # run the chain and return the answer
          answer = qa.run(query)
          return answer
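     On the client, this callable can be invoked just like the summariseURL function earlier; a minimal sketch, assuming the function is deployed under the name "semanticqa" (the wrapper function itself is illustrative):

      import FirebaseFunctions

      // Ask the knowledge base a question via the semanticqa callable.
      let askSofia: Callable<String, String> =
        Functions.functions().httpsCallable("semanticqa")

      func answer(to question: String) async -> String {
        do {
          return try await askSofia(question)
        } catch {
          return "I am sorry, but I could not reach the knowledge base."
        }
      }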
  58.–59. 🧠 Building an exobrain: Architecture ✓ · Storing cleaned-up articles for offline reading ✓ · Summarising articles ✓ · Q&A with your knowledge base ✓ … 🧠 One more thing …
  60.–67. Siri: App Intents (slides 61–67 highlight the annotated lines):

      // App intent, discovered at compile time
      struct AskQuestionIntent: AppIntent {
        // inject our QA service
        @Injected(\.semanticQAService) private var semanticQAService

        static let title: LocalizedStringResource = "Ask a question"
        static let description: LocalizedStringResource = "Asks Sofia a question"

        // input parameter for the intent
        @Parameter(title: "Question", description: "A question to answer from your knowledge base")
        var question: String?

        // execute the intent
        @MainActor
        func perform() async throws -> some ProvidesDialog & ShowsSnippetView {
          // no value provided? Just ask!
          guard let providedPhrase = question else {
            throw $question.needsValueError("Sure - what would you like to know?")
          }
          // call the backend service
          let answer = await semanticQAService.answerQuestion(question: providedPhrase)
          // return the answer in a dialog; Markdown view from gonzalezreal/swift-markdown-ui
          return .result(dialog: IntentDialog(stringLiteral: "Here is your answer")) {
            Markdown(answer)
          }
        }
      }
  68. How I used Siri, PaLM, LangChain, and Firebase to create an Exobrain 🧠