Slide 1

Slide 1 text

Hidden Traps with Robots.txt and GSC
Gianna Brachetti-Truskawa, DeepL SE
Speakerdeck.com/giannabrachetti | @gianna-brachetti-truskawa | @tentaclequing

Slide 2

Slide 2 text

What's the fastest way to get your domain deindexed?

Slide 3

Slide 3 text

Have your robots.txt respond with a 5xx!

Slide 4

Slide 4 text

Today, I want you to look beyond syntax …

Slide 5

Slide 5 text

Sometimes it's not what's in your robots.txt …

Slide 6

Slide 6 text

… but what's outside of it.

Slide 7

Slide 7 text

What you think it is:
• A security guard
• A law bots will have to obey
• A tool to prevent testing environments from being accessible via the internet

Slide 8

Slide 8 text

What it really is:
• An optional set of directives that crawlers can use to save on resources for crawl efficiency
• A security risk

Slide 9

Slide 9 text


Slide 10

Slide 10 text

And it can cost you a lot of money.

Slide 11

Slide 11 text

What we're going to do today:
1. What if you don't have a robots.txt?
2. What if you have one but it becomes unavailable?
3. Why can't you use it as a security guard?
4. What can you do to save yourself trouble?

Slide 12

Slide 12 text

Without a robots.txt, crawlers assume they can access everything.

Slide 13

Slide 13 text

That's fine for small sites; for large sites, it helps to manage crawl budget more efficiently.
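To make that concrete, here is a minimal robots.txt sketch for a large site; the paths and parameter are invented for illustration, not a recommendation:

    User-agent: *
    # Keep crawlers out of internal search results and endless filter permutations
    Disallow: /search/
    Disallow: /*?sort=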

Slide 14

Slide 14 text

But how would I reference my XML sitemap?

Slide 15

Slide 15 text

You're better off keeping it a secret!

Slide 16

Slide 16 text

Why would you hide your XML sitemap?
• Crawl control
• Vulnerability reduction

Slide 17

Slide 17 text

Why would you hide your XML sitemap? Control who accesses your XML sitemap.
Use case:
• Competitive intelligence in eCommerce

Slide 18

Slide 18 text

Why would you hide your XML sitemap? Make it harder to use the sitemap as a vector for attacks.
Use cases:
• Scraping
• (Soft) DDoS attacks
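One way to keep the sitemap out of sight, as a sketch (the file name below is invented): drop the Sitemap: line from robots.txt, host the file under a non-guessable path, and submit that URL only in Search Console.

    # robots.txt: no Sitemap: line at all
    User-agent: *
    Disallow: /search/

    # Sitemap lives at a non-obvious URL, submitted only via GSC:
    # https://www.example.com/s/7f3d2c1a9b.xml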

Slide 19

Slide 19 text

What if your robots.txt becomes unavailable?

Slide 20

Slide 20 text

Let's go back to this shocker:

Slide 21

Slide 21 text

The devil is in the details … or rather, in the documentation.

Slide 22

Slide 22 text

Robots.txt web standards and Google documentation
Specified:
• RFC 9309: Robots Exclusion Protocol (REP)¹
Documentation:
• Search Central documentation: How Google interprets the robots.txt specification²
• Search Console Help: robots.txt report³
¹ https://www.rfc-editor.org/rfc/rfc9309.html
² https://developers.google.com/search/docs/crawling-indexing/robots/robots_txt
³ https://support.google.com/webmasters/answer/6062598?hl=en#crawl_errors
Last accessed 30.09.2024

Slide 23

Slide 23 text

Robots.txt web standards and Google documentation: what is specified vs. what is documented.

Slide 24

Slide 24 text

Google may cache your robots.txt for up to 24h.

Slide 25

Slide 25 text

… or shorter, if you limit caching.
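Google's documentation mentions that the cache lifetime can be adjusted based on Cache-Control headers, so limiting caching is a response-header change; the one-hour value below is just an assumption:

    HTTP/1.1 200 OK
    Content-Type: text/plain
    Cache-Control: max-age=3600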

Slide 26

Slide 26 text

If you make changes, it can still take up to 24h until they're fetched.

Slide 27

Slide 27 text

Redirect chains on your robots.txt can lead to it being ignored.

Slide 28

Slide 28 text

What happens if Google cannot fetch your robots.txt anymore? Client errors (4xx):
• REP: status codes 400–499 MAY be treated as allow all
• Search Central documentation: all 4xx except 429 are treated as allow all; 429 (Too Many Requests) is treated like a 5xx
• Search Console Help: first 12h, Google stops crawling the domain; up to 30 days, it uses the last cached version; after 30 days, it checks whether the site is available in general and treats it as allow all

Slide 29

Slide 29 text

If you have time-sensitive info, it might not be fetched in time.

Slide 30

Slide 30 text

A 429 can lead to your domain being deindexed!

Slide 31

Slide 31 text

What happens if Google cannot fetch your robots.txt anymore? Server errors (5xx): REP, Search Console Help and the Search Central documentation don't agree:
• Treat as complete disallow
• After 30 days, preferably use the last cached version unless it's unavailable; otherwise treat as 4xx (= allow all)
• Treat 4xx and 5xx all the same: allow all

Slide 32

Slide 32 text

Server errors (5xx): contradictions between the known sources (REP, Search Console Help, Search Central documentation, Gary Illyes):
DISALLOW ALL vs. ALLOW ALL vs. DEINDEX ALL

Slide 33

Slide 33 text

DNS errors and connection timeouts are treated the same way!

Slide 34

Slide 34 text

So which is it now, Google?

Slide 35

Slide 35 text

Why you can't use robots.txt as a security guard

Slide 36

Slide 36 text

Error codes can lead to your robots.txt becoming a liability …

Slide 37

Slide 37 text

… especially if you used it to hide secrets.

Slide 38

Slide 38 text

A robots.txt might tell us where the most interesting files are.

Slide 39

Slide 39 text

How your robots.txt can become a liability (even if it's a 200!):
• You're exposing vulnerabilities of your website or servers
• You're relying on your robots.txt to keep things off the internet
• You're not monitoring the content or uptime of your robots.txt

Slide 40

Slide 40 text

Exposing sensitive information can become an expensive GDPR issue.

Slide 41

Slide 41 text

Robots.txt emerged as a practical solution to real-world problems in the early web.

Slide 42

Slide 42 text

But … robots.txt is not legally binding!

Slide 43

Slide 43 text

Google may ignore it in some cases.

Slide 44

Slide 44 text

Example: click tracking via parameters in URLs in prominent places, e.g. https://www.deepl.com/pro?cta=header-prices

Slide 45

Slide 45 text


Slide 46

Slide 46 text

If in doubt, Google may choose indexation over restriction.

Slide 47

Slide 47 text

Best practices to avoid risks

Slide 48

Slide 48 text

Limit access with more reliable methods!

Slide 49

Slide 49 text

Want to remove leaked data from the web?

Slide 50

Slide 50 text

Remove leaked data from the web:
• 410 (Gone)
• X-Robots-Tag: none
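As a rough sketch (the path is invented), the response for a leaked URL could look like the exchange below. Note that the URL has to stay crawlable in robots.txt, otherwise Google never sees the 410 or the header:

    GET /exports/customer-list-2023.csv HTTP/1.1
    Host: www.example.com

    HTTP/1.1 410 Gone
    X-Robots-Tag: none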

Slide 51

Slide 51 text

Want to protect data from leaking into the web?

Slide 52

Slide 52 text

Protect data from leaking into the web:
• HTTP authentication
• X-Robots-Tag: none
• Avoid internal links
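A minimal sketch of what a crawler would then see on a protected path (path and realm are invented): the unauthenticated request gets a 401 plus the restrictive header, so nothing is fetched or indexed.

    GET /staging/ HTTP/1.1
    Host: www.example.com

    HTTP/1.1 401 Unauthorized
    WWW-Authenticate: Basic realm="internal"
    X-Robots-Tag: none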

Slide 53

Slide 53 text

Want to make it hard to spy on you or train genAI with your content?

Slide 54

Slide 54 text

• Rate limiting
• Path obfuscation
• Serve different versions
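Of the three, rate limiting is the easiest to sketch in code. This is a toy fixed-window limiter with arbitrary thresholds, only to illustrate the idea; in practice you would enforce this at the CDN or reverse proxy, not in application code:

    import time
    from collections import defaultdict

    WINDOW_SECONDS = 60
    MAX_REQUESTS_PER_WINDOW = 120  # arbitrary; tune per client / user agent

    _hits = defaultdict(list)

    def allow_request(client_id: str) -> bool:
        # Return False (i.e. answer 429) once a client exceeds its budget for the window.
        now = time.time()
        recent = [t for t in _hits[client_id] if now - t < WINDOW_SECONDS]
        recent.append(now)
        _hits[client_id] = recent
        return len(recent) <= MAX_REQUESTS_PER_WINDOW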

Slide 55

Slide 55 text

Example by @pandraus, 03.07.2024

Slide 56

Slide 56 text

Monitor content and uptime!

Slide 57

Slide 57 text

GSC shows you versions and status of your robots.txt.

Slide 58

Slide 58 text

Uptime / status code monitoring
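A small monitoring sketch in Python (URL and user agent are placeholders): it returns the status code plus a content hash, so silent edits show up as a changed hash, and a 4xx/5xx or a timeout is your cue to alert someone.

    import hashlib
    import urllib.error
    import urllib.request

    ROBOTS_URL = "https://www.example.com/robots.txt"

    def check_robots() -> tuple[int, str]:
        # Fetch robots.txt and return (status code, SHA-256 of the body).
        req = urllib.request.Request(ROBOTS_URL, headers={"User-Agent": "robots-monitor"})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                body = resp.read()
                return resp.status, hashlib.sha256(body).hexdigest()
        except urllib.error.HTTPError as err:
            return err.code, ""  # 4xx/5xx: alert, and remember the 429 trap from earlier

    if __name__ == "__main__":
        print(check_robots())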

Slide 59

Slide 59 text

Monitor Search Console error classes.

Slide 60

Slide 60 text

Set filters for WNC- messages and the error classes most relevant to you.

Slide 61

Slide 61 text

If you do want to test syntax, use a parser, e.g. the one by Will Critchlow: www.realrobotstxt.com
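If you would rather script such checks, the Python standard library ships a basic parser; it is not Google's own matcher, so treat it as a sanity check only:

    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.parse([
        "User-agent: *",
        "Disallow: /search/",
    ])

    print(rp.can_fetch("Googlebot", "https://www.example.com/search/?q=test"))  # False
    print(rp.can_fetch("Googlebot", "https://www.example.com/pricing"))         # True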

Slide 62

Slide 62 text

Go multi-level to control which content AI crawlers can access.
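The robots.txt level of that stack could look like the sketch below (the user-agent tokens shown are commonly published AI crawler names; verify the current list yourself). It only affects bots that choose to comply, so pair it with server-side measures such as user-agent or IP checks and rate limiting:

    User-agent: GPTBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /

    User-agent: CCBot
    Disallow: /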

Slide 63

Slide 63 text

• Multilingual tech SEO strategist of 15 years
• Turned PM driving growth in global markets
• Ask me about complex tech issues, B2B SaaS SEO, and …
• … my paper for the IAB Workshop on AI Control
Current role: PM Search at DeepL

Slide 64

Slide 64 text

Thank you!

Slide 65

Slide 65 text

Read on here: https://www.robotstxt.org/faq/legal.html

Slide 66

Slide 66 text

How your robots.txt can become a liability: you're exposing vulnerabilities of your website or servers
• Outdated folders with sensitive data
• Server vulnerabilities (Apache server status)
• Admin login paths in your CMS
• Internal services

Slide 67

Slide 67 text

How your robots.txt can become a liability: you're relying on your robots.txt to keep things off the internet
• Mitigate scraping attacks
• Restrict access to personal data
• Avoid duplicate content issues (why not target the root cause?)
• Get things you accidentally leaked out of the index

Slide 68

Slide 68 text

How your robots.txt can become a liability: you're not monitoring the content or uptime of your robots.txt
• In-house teams make changes without aligning with you
• CMS plugins can change it even when that isn't their purpose, without disclosure in their release notes