
Prompt Engineering Fundamentals

Slides used to record the video for Lesson 4 (Generative AI for Beginners)

Course:
https://aka.ms/genai-beginners

Recording:
https://youtu.be/r2ItK3UMVTk

In this lesson, we learn what prompt engineering is, why it matters, and how to craft more effective prompts for a given model and application objective. We'll cover core concepts and best practices for prompt engineering, and explore an interactive Jupyter Notebook "sandbox" environment where we can see these concepts applied to real examples.

By the end of this lesson we will be able to:

Explain what prompt engineering is and why it matters.
Describe the components of a prompt and how they are used.
Use best practices and techniques for prompt engineering.
Apply learned techniques to real examples, using an OpenAI endpoint.

Nitya Narasimhan, PhD

November 01, 2023


Transcript

  1. Prompt Engineering Fundamentals Build your intuition for crafting better prompts

    by iteration & validation. Think art – not science.
  2. What will we be covering in this lesson? Define prompt engineering – what is it and why is it important? Know prompt construction – how are prompts structured and used? Learn prompting best practices – improve quality with iteration. Apply learned techniques to real examples – in our code challenge.
  3. Let’s Review – Terminology We’ve Seen So Far. What is Generative AI? A type of AI that generates new content, using LLMs, in response to a user “prompt”. What are Large Language Models? AI models trained on massive datasets and specialized for natural language tasks. What are Prompts? Text input to an LLM that “instructs” it what to do – it conditions the model and influences the response. Prompt: I “instruct” the AI what to do in natural language (text, chat). Completion: The AI “predicts” what it thinks I need in response (text, code). Example model: OpenAI GPT-3.5-turbo – optimized for chat, 4097-token context, trained on data up to Sep 2021.
  4. Let’s Review – Our Application Context. We’re an education startup building AI apps for personalized learning (“Collaborating to bring AI Innovation To Education” – Educator Blog, Jun 2023). Our target application & audience: Administrator – I need an assistant that can help me analyze all the curriculum data and identify gaps in coverage, so I can plan better for next year. Educator – I need an assistant that can help me generate a lesson plan for a specific topic and target audience, and I need this to be presentation-friendly by default. Student – I need a tutor that can coach me on this subject in a way that matches my skill level & learning style; I like examples, and I want hints when I get stuck.
  5. How does a Prompt work? The LLM sees the prompt as a sequence of tokens. Try it yourself: https://platform.openai.com/tokenizer
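The tokenizer page linked above shows the real byte-pair encoding in action. As a rough illustration only (plain whitespace splitting, not actual BPE), the "prompt becomes a token sequence" idea can be sketched as:

```python
# Rough illustration only: real LLM tokenizers use byte-pair encoding
# (try https://platform.openai.com/tokenizer); whitespace splitting is
# just a stand-in to show "prompt -> sequence of tokens".
def rough_tokens(prompt: str) -> list[str]:
    return prompt.split()

tokens = rough_tokens("The quick brown fox jumps over the lazy dog")
print(len(tokens))  # 9 "tokens" under this naive scheme
```

Real token counts differ: BPE often splits a single word into several tokens, which is why the tokenizer page is worth trying directly.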
  6. How does a Prompt work? Foundation LLMs simply predict the next token. Here the user did not specify any instructions, so the AI treats the input as a request for information and predicts a response from its training data. Try it yourself: https://oai.azure.com/portal/chat
  7. How does a Prompt work? An instruction-tuned LLM extends base behavior for a task. Add context with a “system” message that tunes the default LLM behavior. Try it yourself: https://oai.azure.com/portal/chat
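A minimal sketch of adding a "system" message, assuming the `openai` Python client; the model name, system text, and key handling are placeholders, and the actual network call is left commented out:

```python
# Sketch: a "system" message tunes the default LLM behavior.
# Assumes the openai package and an OPENAI_API_KEY in the environment;
# the model name and message texts are placeholders.
def build_messages(system: str, user: str) -> list[dict]:
    """Assemble a chat payload: system context first, then user input."""
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

messages = build_messages(
    "You are a history teacher who explains concepts to beginners.",
    "What was the Civil War?",
)
print(messages[0]["role"])  # system

# Actual call (requires network and an API key):
# from openai import OpenAI
# client = OpenAI()
# response = client.chat.completions.create(
#     model="gpt-3.5-turbo", messages=messages
# )
# print(response.choices[0].message.content)
```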
  8. How does a Prompt work? An instruction-tuned LLM extends base behavior for a task. Now try the same content snippet again as your user input. Do you see the difference? Try it yourself: https://oai.azure.com/portal/chat
  9. Definition: What is Prompt Engineering? A prompt is the text input given to the Generative AI model. Prompt engineering is the process of designing and optimizing the prompt until the response meets the user’s expectations for relevance or quality. Try it yourself: https://oai.azure.com/portal/chat
  10. Motivation: Why is Prompt Engineering Necessary? • LLMs are stochastic in nature – since prompts “program” the model, responses are highly sensitive to the construction of the prompt. • LLMs can hallucinate responses – since pre-trained data has a cutoff date, they may predict completions that are not grounded in fact. • LLMs have diverse capabilities – every model has unique features and quirks that can be tuned for better-quality responses. Try it yourself: https://oai.azure.com/portal/chat
  11. Case Study: Real-World Usage in GitHub Copilot. “For example, when asking GitHub Copilot to draw an ice cream cone using p5.js, a JavaScript library for creative coding, we kept receiving irrelevant suggestions—or sometimes no suggestions at all.” Prompt: “Draw an ice-cream cone with ice cream using p5.js”. Read the article: https://github.blog/2023-06-20-how-to-write-better-prompts-for-github-copilot/
  12. Case Study: How is Prompt Engineering Used in Real Apps? “When we adjusted our prompt, we were able to generate more accurate results.” Three best practices for prompt crafting: set the stage with a high-level goal, be simple & specific, give examples. Revised prompt: “Draw an ice-cream cone with an ice cream scoop and a cherry on top. The ice cream cone will be a triangle with the point facing down … The ice cream scoop will be a half circle on top of the cone …” Read the article: https://github.blog/2023-06-20-how-to-write-better-prompts-for-github-copilot/
  13. Simple: Content. Type in text, see the completion. In the simplest form, you provide a text input (prompt) and the model predicts the next tokens (completion). This is the fundamental behavior of any LLM – remember it. Try it yourself: https://oai.azure.com/portal/chat
  14. Complex: Content. Multi-turn conversation with context. Prompt text is sent as ‘messages’ that carry system context (priming) and user/assistant exchanges (history) in a sliding token window. Try it yourself: https://platform.openai.com/examples/default-marv-sarcastic-chat
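A minimal sketch of the sliding-window idea: keep the system message pinned and drop the oldest user/assistant turns when the history grows too long. The turn limit and the Marv-style sample dialogue here are illustrative:

```python
def trim_history(messages: list[dict], max_turns: int = 4) -> list[dict]:
    """Keep the system message plus only the most recent turns.

    Sketch only: production code would count tokens, not turns.
    """
    system = [m for m in messages if m["role"] == "system"]
    turns = [m for m in messages if m["role"] != "system"]
    return system + turns[-max_turns:]

history = [
    {"role": "system", "content": "You are Marv, a sarcastic chatbot."},
    {"role": "user", "content": "How many pounds in a kilogram?"},
    {"role": "assistant", "content": "About 2.2, as everyone knows."},
    {"role": "user", "content": "What does HTML stand for?"},
    {"role": "assistant", "content": "HyperText Markup Language. Fun."},
    {"role": "user", "content": "When was the first airplane flight?"},
]
trimmed = trim_history(history, max_turns=4)
print(len(trimmed))  # 5: system message + last 4 turns
```

The oldest exchange falls out of the window while the system "priming" stays, which is exactly why long chats can forget early details.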
  15. Simple: Instruction. Text input specifies the task, not the content. “Write a description of the Civil War.” Try it yourself: https://oai.azure.com/portal/chat
  16. Complex: Instruction. Text input specifies the task and adds details. “Write a description of the Civil War. Provide key dates and events and describe their significance.” Try it yourself: https://oai.azure.com/portal/chat
  17. Complex, Formatted: Instruction. Text input specifies the task, adds details, and defines the format. “Write a description of the Civil War in 1 paragraph. Provide 3 bullet points with key dates and their significance. Provide 3 more bullet points with key historical figures and their contributions. Return the output as a JSON file.” Try it yourself: https://oai.azure.com/portal/chat
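When the prompt requests JSON output, the application can parse the completion directly. A sketch, with a hand-written sample completion standing in for the model response (no API call is made):

```python
import json

prompt = (
    "Write a description of the Civil War in 1 paragraph. "
    "Provide 3 bullet points with key dates and their significance. "
    "Return the output as JSON with keys 'summary' and 'key_dates'."
)

# Hand-written stand-in for a model completion; real completions may
# need validation, since the model is not guaranteed to emit valid JSON.
sample_completion = (
    '{"summary": "A brief description of the Civil War...", '
    '"key_dates": ["1861", "1863", "1865"]}'
)

data = json.loads(sample_completion)
print(sorted(data.keys()))  # ['key_dates', 'summary']
```

Defining the format in the prompt is what makes the completion machine-readable downstream; without it, the app would have to parse free-form prose.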
  18. Construct: Instruction + Primary Content. Specify the task and provide content as context for it. “Jupiter is the fifth planet from the Sun and the largest in the Solar System. It is a gas giant with a mass one-thousandth that of the Sun, but two-and-a-ha … Summarize this in 2 short sentences.” Try it yourself: https://oai.azure.com/portal/chat
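Instruction plus primary content is just string assembly: the content is pasted in as context and the instruction follows. A minimal sketch:

```python
content = (
    "Jupiter is the fifth planet from the Sun and the largest in the "
    "Solar System. It is a gas giant with a mass one-thousandth that "
    "of the Sun."
)
instruction = "Summarize this in 2 short sentences."

# Instruction + primary content: the content provides context,
# the instruction specifies the task to perform on it.
prompt = f"{content}\n\n{instruction}"
print(prompt.endswith(instruction))  # True
```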
  19. Primary Content: Provide Examples. Examples act as in-context guidance: the model reacts to examples and infers response patterns. Zero-shot prompting: the user provides 1 instruction (explicit) + 0 examples. Try it yourself: https://oai.azure.com/portal/chat
  20. Primary Content: Provide Examples. Examples act as in-context guidance: the model reacts to examples and infers response patterns. One-shot prompting: the user provides 0 instructions + 1 example (implicit). Try it yourself: https://oai.azure.com/portal/chat
  21. Primary Content: Provide Examples. Examples act as in-context guidance: the model reacts to examples and infers response patterns. Few-shot prompting: the user provides 0 instructions + a few examples (improving inference). Try it yourself: https://oai.azure.com/portal/chat
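Zero-, one-, and few-shot prompts differ only in how many worked examples precede the final input. A sketch that assembles a few-shot sentiment prompt (the task and example texts are made up for illustration):

```python
def few_shot_prompt(examples: list[tuple[str, str]], query: str) -> str:
    """Build a prompt from (input, output) example pairs plus a query.

    The examples show the model the pattern; the final line ends with
    an open label for the model to complete.
    """
    lines = []
    for text, label in examples:
        lines.append(f"Text: {text}\nSentiment: {label}")
    lines.append(f"Text: {query}\nSentiment:")
    return "\n\n".join(lines)

examples = [
    ("I loved this lesson!", "positive"),
    ("This made no sense at all.", "negative"),
]
prompt = few_shot_prompt(examples, "The examples really helped.")
print(prompt.count("Sentiment:"))  # 3: two examples + the open query
```

With an empty `examples` list, the same function produces a zero-shot prompt; with one pair, a one-shot prompt.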
  22. Primary Content: Provide Cues. Cues ‘prime’ the response: the model ‘takes the cue’ and favors completions that match the pattern. Here: 0 cues. Try it yourself: https://oai.azure.com/portal/chat
  23. Primary Content: Provide Cues. Cues ‘prime’ the response: the model ‘takes the cue’ and favors completions that match the pattern. Here: 1 cue. Try it yourself: https://oai.azure.com/portal/chat
  24. Primary Content: Provide Cues. Cues ‘prime’ the response: the model ‘takes the cue’ and favors completions that match the pattern. Here: 2 cues. Try it yourself: https://oai.azure.com/portal/chat
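A cue is simply text the completion must continue from. A sketch of a prompt that ends with a cue so the model's continuation follows the started pattern (the wording of the cue is illustrative):

```python
content = (
    "Jupiter is the fifth planet from the Sun and the largest "
    "in the Solar System."
)

# Ending the prompt mid-pattern "primes" the completion: the model
# continues the bulleted list the cue has started.
cue = "Summarize this. Key takeaways:\n- "
prompt = f"{content}\n\n{cue}"
print(prompt.endswith("- "))  # True: the completion continues the bullet
```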
  25. Primary Content: Use Templates. Pre-defined “recipes” for repeatability. Users can “reuse” core templates from an application UI; system message templates can set “personality” consistently. Try it yourself: https://oai.azure.com/portal/chat
  26. Primary Content: Use Templates. Pre-defined “recipes” for repeatability. Apps can customize & use templates with APIs for automation at scale; user message templates use placeholders for dynamic input binding. Read about it: Prompt Engineering and LLMs with LangChain
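Placeholder binding can be as simple as `str.format` (libraries like LangChain's `PromptTemplate` wrap the same idea with validation on top). The template text and field names below are illustrative, not from the lesson:

```python
# System-message template with placeholders for dynamic input binding.
# Field names (topic, audience) are illustrative only.
TEMPLATE = (
    "You are a teaching assistant. Generate a lesson plan on {topic} "
    "for a {audience} audience, formatted as presentation slides."
)

prompt = TEMPLATE.format(topic="photosynthesis", audience="middle school")
print("{topic}" in prompt)  # False: placeholders are fully bound
```

Because the template is fixed and only the fields vary, every generated prompt keeps the same "personality" and structure, which is the repeatability the slide describes.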
  27. Prompt Engineering Mindset. Prompt engineering is a trial-and-error process that also relies on your intuition. 1. Build domain understanding – use your domain expertise to customize the prompt for relevance. 2. Build model understanding – adapt the prompt to suit model strengths & weaknesses to improve quality. 3. Iterate & validate – define acceptance or termination criteria so you iterate to meet expectations but don’t over-engineer the prompt, which reduces reusability.
  28. Prompt Engineering Best Practices. 1. Evaluate the latest models. 2. Separate instructions & context. 3. Be specific & clear. 4. Be descriptive & use examples. 5. Use cues – think ‘priming’ responses. 6. Double down – think reinforcement. 7. Order matters – think recency bias. 8. Give the model an out – think fallback.
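Best practice 2 (separate instructions & context) is commonly implemented with explicit delimiters so the model can tell the two apart. A sketch using triple quotes as the delimiter (the instruction and text are illustrative):

```python
instructions = "Summarize the text below in one sentence."
context = (
    "Prompt engineering is the process of designing and optimizing "
    "the prompt until the response meets expectations."
)

# Delimiters make the boundary between instruction and context
# explicit, so content inside the quotes is treated as data, not
# as further instructions.
prompt = f'{instructions}\n\nText: """{context}"""'
print(prompt.count('"""'))  # 2: opening and closing delimiters
```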
  29. Code Challenge – Build Intuition With Trial & Error. Set the model and parameters; {text} is used as a prompt template variable for consistency & reuse. The notebook shows a prompt template example and a prompt request made on the model, run in a container (Jupyter Notebook, Python 3.10 kernel): the code cell shows the actual execution output and the Markdown cell shows a sample response.
  30. What We Learned Today. Prompts help you “program” Gen AI. What is prompt engineering? Why should we care? How are prompts constructed? How can we optimize prompts? Art, not science. Code challenges.
  31. Prompt Engineering Fundamentals Build your intuition for crafting better prompts

    by iteration & validation. Think art – not science.