Slide 1

Slide 1 text

Doctrine Lexer use case

Slide 2

Slide 2 text

No content

Slide 3

Slide 3 text

About me: Olivier Dolbeau, @odolbeau, Web Architect

Slide 4

Slide 4 text

RIDESHARING: cost of motoring, 100€ alone vs. 4 × 25€ when shared (@BlaBlaCar_FR)

Slide 5

Slide 5 text

A fast-growing community

Slide 6

Slide 6 text

A European phenomenon

Slide 7

Slide 7 text

No content

Slide 8

Slide 8 text

I want to be able to search tweets using a human language.

Slide 9

Slide 9 text

No content

Slide 10

Slide 10 text

No content

Slide 11

Slide 11 text

No content

Slide 12

Slide 12 text

No content

Slide 13

Slide 13 text

Where are my tweets?

Slide 14

Slide 14 text

No content

Slide 15

Slide 15 text

No content

Slide 16

Slide 16 text

No content

Slide 17

Slide 17 text

No content

Slide 18

Slide 18 text

No content

Slide 19

Slide 19 text

No content

Slide 20

Slide 20 text

Other solutions: Logstash, home-made external river

Slide 21

Slide 21 text

elasticsearch: distributed, RESTful search and analytics

Slide 22

Slide 22 text

No content

Slide 23

Slide 23 text

No content

Slide 24

Slide 24 text

No content

Slide 25

Slide 25 text

No content

Slide 26

Slide 26 text

No content

Slide 27

Slide 27 text

We know what we want. We know where our data is.

Slide 28

Slide 28 text

Now let’s see HOW we are going to retrieve it.

Slide 29

Slide 29 text

How… to translate a human language into a technical one.

Slide 30

Slide 30 text

“A retweet FROM @CameronNash23”

Slide 31

Slide 31 text

No content

Slide 32

Slide 32 text

What’s a Lexer? In computer science, lexical analysis is the process of converting a sequence of characters into a sequence of tokens, i.e. meaningful character strings.
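As a concrete illustration (a sketch only, reusing the query from the earlier slide; the array shape matches doctrine/lexer 1.x, where each token carries a value, a type and a position; newer versions return Token objects):

    <?php
    // The input string from the earlier slide...
    $input = 'A retweet FROM @CameronNash23';

    // ...becomes a sequence of tokens, i.e. meaningful character strings,
    // each tagged with a type and its position in the input.
    // (Types shown by name for readability; doctrine/lexer uses integer constants.)
    $tokens = [
        ['value' => 'A',              'type' => 'T_SELECTOR', 'position' => 0],
        ['value' => 'retweet',        'type' => 'T_RETWEET',  'position' => 2],
        ['value' => 'FROM',           'type' => 'T_FROM',     'position' => 10],
        ['value' => '@CameronNash23', 'type' => 'T_USERNAME', 'position' => 15],
    ];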

Slide 33

Slide 33 text

Tokens! Now!

Slide 34

Slide 34 text

No content

Slide 35

Slide 35 text

No content

Slide 36

Slide 36 text

No content

Slide 37

Slide 37 text

No content

Slide 38

Slide 38 text

“A retweet FROM @CameronNash23” → T_SELECTOR T_RETWEET T_FROM T_USERNAME
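A minimal sketch of how such a lexer could look on top of doctrine/lexer (the class name TweetLexer, the patterns and the whole implementation are illustrative assumptions, not the speaker's actual code; signatures assume doctrine/lexer 1.x):

    <?php

    use Doctrine\Common\Lexer\AbstractLexer;

    class TweetLexer extends AbstractLexer
    {
        public const T_NONE     = 0;
        public const T_SELECTOR = 1; // "A"
        public const T_RETWEET  = 2; // "retweet"
        public const T_FROM     = 3; // "FROM"
        public const T_USERNAME = 4; // "@CameronNash23"

        // Regexes for the pieces we want to catch as tokens.
        protected function getCatchablePatterns()
        {
            return [
                '@[a-zA-Z0-9_]+', // Twitter usernames
                '[a-zA-Z]+',      // plain words / keywords
            ];
        }

        // Whatever matches here is skipped (whitespace).
        protected function getNonCatchablePatterns()
        {
            return ['\s+'];
        }

        // Decide the token type for each matched value.
        protected function getType(&$value)
        {
            switch (true) {
                case $value[0] === '@':
                    return self::T_USERNAME;
                case strcasecmp($value, 'a') === 0:
                    return self::T_SELECTOR;
                case strcasecmp($value, 'retweet') === 0:
                    return self::T_RETWEET;
                case strcasecmp($value, 'from') === 0:
                    return self::T_FROM;
                default:
                    return self::T_NONE;
            }
        }
    }

    // $lexer = new TweetLexer();
    // $lexer->setInput('A retweet FROM @CameronNash23');
    // while ($lexer->moveNext()) { /* inspect $lexer->lookahead */ }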

Slide 39

Slide 39 text

How… to use these tokens to generate an Elasticsearch query.
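A sketch of that step, under the same assumptions as the TweetLexer sketch above (the class name, the Elasticsearch field names retweeted and user.screen_name, and the bool/filter query form of recent Elasticsearch versions are illustrative, not the speaker's actual code):

    <?php

    // Walk the token stream and build an Elasticsearch bool query body
    // as a plain PHP array (ready to be json_encode()d and sent to ES).
    class TweetQueryBuilder
    {
        public function build(TweetLexer $lexer): array
        {
            $filters = [];

            while ($lexer->moveNext()) {
                $token = $lexer->lookahead; // array in doctrine/lexer 1.x

                switch ($token['type']) {
                    case TweetLexer::T_RETWEET:
                        // Hypothetical mapping: retweets are flagged in a boolean field.
                        $filters[] = ['term' => ['retweeted' => true]];
                        break;
                    case TweetLexer::T_FROM:
                        // The username token follows the FROM keyword.
                        $lexer->moveNext();
                        $username  = ltrim($lexer->lookahead['value'], '@');
                        $filters[] = ['term' => ['user.screen_name' => $username]];
                        break;
                }
            }

            return ['query' => ['bool' => ['filter' => $filters]]];
        }
    }

    // $lexer = new TweetLexer();
    // $lexer->setInput('A retweet FROM @CameronNash23');
    // $body = (new TweetQueryBuilder())->build($lexer);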

Slide 40

Slide 40 text

No content

Slide 41

Slide 41 text

No content

Slide 42

Slide 42 text

No content

Slide 43

Slide 43 text

No content

Slide 44

Slide 44 text

It works!

Slide 45

Slide 45 text

No content

Slide 46

Slide 46 text

No content

Slide 47

Slide 47 text

No content

Slide 48

Slide 48 text

And what about Doctrine ORM? Let’s take a look at: Doctrine\ORM\Query\Lexer

Slide 49

Slide 49 text

From DQL to SQL
DQL: SELECT u FROM MyProject\Model\User u WHERE u.username = 'foobar'
SQL: SELECT t0.id AS id1, t0.name AS name3, t0.username AS username4, t0.email AS email5 FROM mos_users t0 WHERE t0.username = 'foobar'
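A quick way to see Doctrine's own lexer at work on that DQL string (a sketch, assuming doctrine/orm 2.x with doctrine/lexer 1.x, where each token is an array; newer versions expose Token objects instead):

    <?php

    require 'vendor/autoload.php';

    use Doctrine\ORM\Query\Lexer;

    $dql = "SELECT u FROM MyProject\Model\User u WHERE u.username = 'foobar'";

    // The ORM lexer tokenizes the DQL string on construction.
    $lexer = new Lexer($dql);

    while ($lexer->moveNext()) {
        $token = $lexer->lookahead; // ['value' => ..., 'type' => ..., 'position' => ...]
        printf("%-30s %s\n", $token['value'], $token['type']);
    }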

Slide 50

Slide 50 text

77 tokens!!

Slide 51

Slide 51 text

Some patterns...

Slide 52

Slide 52 text

And a switch (true)!
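What that idiom looks like, heavily simplified (an illustration of the pattern used by Doctrine\ORM\Query\Lexer::getType(), not the actual Doctrine source, and only a handful of its 77 token types):

    <?php

    class SwitchTrueExample
    {
        public const T_NONE            = 1;
        public const T_INTEGER         = 2;
        public const T_STRING          = 3;
        public const T_INPUT_PARAMETER = 4;

        protected function getType(&$value)
        {
            switch (true) {
                // Each case is a boolean test on the matched value;
                // the first one that passes decides the token type.
                case is_numeric($value):
                    return self::T_INTEGER;         // simplified: Doctrine also detects floats
                case $value[0] === "'":
                    return self::T_STRING;          // quoted string literal
                case $value[0] === '?' || $value[0] === ':':
                    return self::T_INPUT_PARAMETER; // ?1 or :named parameter
                default:
                    return self::T_NONE;
            }
        }
    }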

Slide 53

Slide 53 text

Forget the Parser!

Slide 54

Slide 54 text

No content

Slide 55

Slide 55 text

No content

Slide 56

Slide 56 text

No content

Slide 57

Slide 57 text

Example: A trip OR A search FROM "Paris, France" OR TO "Paris, France" BETWEEN 12/01/2013 AND 26/05/2014
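For a query like the one above, the lexer alone only yields a flat token stream (the token names below, other than T_FROM, are hypothetical); deciding what OR, TO, BETWEEN and AND mean structurally is exactly the parser's job:

    <?php

    // Hypothetical token stream for:
    // A trip OR A search FROM "Paris, France" OR TO "Paris, France"
    // BETWEEN 12/01/2013 AND 26/05/2014
    $tokenStream = [
        'T_SELECTOR', 'T_TRIP', 'T_OR', 'T_SELECTOR', 'T_SEARCH',
        'T_FROM', 'T_LOCATION', 'T_OR', 'T_TO', 'T_LOCATION',
        'T_BETWEEN', 'T_DATE', 'T_AND', 'T_DATE',
    ];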

Slide 58

Slide 58 text

Lexer to GET tokens. Parser to USE tokens.

Slide 59

Slide 59 text

@odolbeau We’re hiring! :) THANK YOU!

Slide 60

Slide 60 text

@odolbeau https://joind.in/11246 https://github.com/odolbeau/elasticsearch-sandbox https://speakerdeck.com/odolbeau/doctrine-lexer-use-case