Slide 1

Slide 1 text

Sharing is Caring: Top Prompts for LLMs. Bastian Grimm, Peak Ace AG | @basgr

Slide 2

Slide 2 text

A whole load of tips in tiny text, squeezed into jam-packed, barely readable PDFs. My timelines are full of ChatGPT cheat sheets!

Slide 3

Slide 3 text

Everyone wants them. So do I!

Slide 4

Slide 4 text

The reality, though: these PDFs are rotting somewhere on my hard drive, collecting virtual dust. Sadly, most of them are garbage.

Slide 5

Slide 5 text

#1 Ask ChatGPT

Slide 6

Slide 6 text

peakace.agency ChatGPT doesn't ask questions: it fills gaps with assumptions. In contrast, any human being would ask questions if they didn't understand something. You: I need your help creating regular expressions to build URL redirects. What information do you need from me?

Slide 7

Slide 7 text

Always double-check if that's REALLY everything… Double-checking usually produces a good chunk of additional ideas and thoughts to consider as prompt input. You: Do you need any further information? Are you sure this is all you need?

Slide 8

Slide 8 text

Ditch what you don't need and let ChatGPT create a prompt template for you. You: Please convert this into a prompt template and mark any placeholders I need to fill in with brackets. Remove items 6 (performance) and 7 (testing).

Slide 9

Slide 9 text

I find this extremely helpful for building prompts that are less error-prone and genuinely comprehensive.

Slide 10

Slide 10 text

#2 The right framework

Slide 11

Slide 11 text

Depending on your goal, pick your framework wisely. Hat tip to Dr Marcell Vollmer, who shared this visual on LinkedIn outlining different strategies for crafting effective prompts that target specific outcomes by emphasising the roles, tasks, and desired results. Source: https://pa.ag/3T2kMQ8 A structured approach to formulating prompts using different frameworks, each designed to optimise the interaction for specific outcomes.

Slide 12

Slide 12 text

Some of my core ChatGPT/LLM use cases right now. They all come with a somewhat different prompt syntax: do research, create outlines, create summaries, validate [things], ideation & concepts, speed up learning, simplify content, write code.

Slide 13

Slide 13 text

Writing code such as complex Regular Expressions (RegEx). RegEx query filters in Google Search Console are extremely handy and powerful, but a pain to create by hand:
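For context, here is a minimal sketch of the kind of pattern such a query filter might use. The pattern and sample queries are hypothetical illustrations, not from the slides; note that Search Console's regex filters use RE2 syntax, so anything complex is worth testing there directly.

```python
import re

# Hypothetical example of a GSC-style query filter:
# match question-type queries starting with a common interrogative.
QUESTION_QUERY = re.compile(r"^(how|what|why|when|where|who)\b", re.IGNORECASE)

queries = ["how to set up redirects", "peak ace agency", "What is a canonical tag?"]
question_queries = [q for q in queries if QUESTION_QUERY.match(q)]
# question_queries -> ["how to set up redirects", "What is a canonical tag?"]
```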

Slide 14

Slide 14 text

For (complex) data, drag'n'drop a CSV and explain the columns. E.g., upload a Sistrix data export, explain the columns and get a quick first overview of untapped potential. Convenience = on-the-fly fixes: "[…] data from the spreadsheet appears to be incorrectly delimited, resulting in a single column containing all data […] it has been successfully reformatted, revealing several columns […]"

Slide 15

Slide 15 text

#3 Custom GPTs

Slide 16

Slide 16 text

Why use Custom GPTs? Just ask ChatGPT: they offer more "everything" (context awareness, consistency, fine-tuning, …)

Slide 17

Slide 17 text

A Custom GPT in its simplest form: using Peak Ace's Structured Data GPT to debug and fix errors in JSON-LD mark-up. Source: https://pa.ag/structured-data

Slide 18

Slide 18 text

You could prompt a Custom GPT the same way… Technically, however, each prompt needs fewer details (such as specific context), as you have already provided these when creating/setting up the Custom GPT:

Slide 19

Slide 19 text

Pimp your GPT: integrating third-party data to make it smarter. A Custom GPT, linked to the DataForSEO API to provide real-time access to the latest search volume data:
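As a hedged sketch of what such an integration involves on the client side, the snippet below only builds the request a Custom GPT action (or any script) might send. The endpoint path and payload fields are assumptions for illustration; check DataForSEO's own API documentation for the actual contract.

```python
import base64

def build_request(login: str, password: str, keywords: list) -> dict:
    """Prepare a DataForSEO-style search volume request (no network call)."""
    # DataForSEO-style Basic auth header from login:password (assumption).
    token = base64.b64encode(f"{login}:{password}".encode()).decode()
    return {
        # Assumed endpoint path, for illustration only:
        "url": "https://api.dataforseo.com/v3/keywords_data/google_ads/search_volume/live",
        "headers": {"Authorization": f"Basic {token}"},
        "json": [{"keywords": keywords, "location_name": "Germany"}],
    }

req = build_request("demo-login", "demo-password", ["seo audit"])
```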

Slide 20

Slide 20 text

#Bonus Temperature Settings

Slide 21

Slide 21 text

Temperature controls the randomness of the output: higher values produce more random outputs, lower values more deterministic ones. Understanding OpenAI's Temperature
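To illustrate the mechanics (a common textbook description, not necessarily OpenAI's exact implementation): logits are divided by the temperature before the softmax, so a low temperature sharpens the distribution towards the top token and a high temperature flattens it.

```python
import math

def softmax_with_temperature(logits, temperature):
    """Scale logits by 1/temperature, then apply a numerically stable softmax."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max to avoid overflow in exp()
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]
cold = softmax_with_temperature(logits, 0.2)  # near-deterministic: top token dominates
hot = softmax_with_temperature(logits, 2.0)   # flatter distribution: more random sampling
```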

Slide 22

Slide 22 text

It's a number between 0 and 2; the OpenAI API defaults to 1.0 (some tools use 0.7). For code generation or data analysis, go low (0.2–0.3); for chatbots, around 0.5; for creative writing, 0.7–0.8. Understanding OpenAI's Temperature

Slide 23

Slide 23 text

The top_p parameter, also known as nucleus sampling, is another setting that controls the diversity of the generated text. Whatever you do, don't use both at the same time… top_p as an alternative to temperature sampling
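A minimal sketch of what nucleus (top_p) sampling does, to illustrate the parameter's effect: keep the smallest set of tokens whose cumulative probability reaches p, renormalise, and sample only from that set. The token probabilities are invented for the example.

```python
def nucleus(probs, top_p):
    """Keep the most probable tokens until their cumulative probability
    reaches top_p, then renormalise the kept probabilities."""
    kept, cumulative = {}, 0.0
    for token, p in sorted(probs.items(), key=lambda kv: kv[1], reverse=True):
        kept[token] = p
        cumulative += p
        if cumulative >= top_p:
            break
    total = sum(kept.values())
    return {t: p / total for t, p in kept.items()}

probs = {"the": 0.5, "a": 0.3, "cat": 0.15, "zebra": 0.05}
pool = nucleus(probs, 0.75)
# pool keeps "the" and "a" (0.5 + 0.3 >= 0.75) and renormalises them
```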

Slide 24

Slide 24 text

How? Either through the OpenAI API or via the OpenAI Playground.

Slide 25

Slide 25 text

That's all I've got for today. #kthxbye