Slide 1

Slide 1 text

HTTP/2 performance and Non-blocking Architecture @julienviet

Slide 2

Slide 2 text

No content

Slide 3

Slide 3 text

No content

Slide 4

Slide 4 text

Latency vs Bandwidth impact on Page Load Time [two charts: Page Load Time as bandwidth increases from 1 Mbps to 10 Mbps, and Page Load Time as latency decreases from 200 ms down to 20 ms]
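The point of the two charts: past a few Mbps, extra bandwidth barely moves page load time, while every latency cut keeps paying off. A toy model makes this concrete (an illustrative assumption, not data from the slides: load time ≈ round trips × RTT + transfer time):

```java
// Toy page-load model (illustrative assumption, not from the slides):
// total time = sequential round trips * RTT + raw transfer time of the page.
public class PageLoadModel {
    // roundTrips: sequential request/response exchanges; rttMs: round-trip time;
    // pageKb: total page weight in KB; mbps: link bandwidth.
    public static double pageLoadMs(int roundTrips, double rttMs, double pageKb, double mbps) {
        double transferMs = (pageKb * 8.0 / 1000.0) / mbps * 1000.0; // KB -> megabits -> ms
        return roundTrips * rttMs + transferMs;
    }

    public static void main(String[] args) {
        // Doubling bandwidth saves less and less each time...
        for (int mbps = 1; mbps <= 10; mbps++)
            System.out.printf("%2d Mbps -> %.0f ms%n", mbps, pageLoadMs(30, 100, 1000, mbps));
        // ...while every latency reduction saves the same amount again.
        for (int rtt = 200; rtt >= 20; rtt -= 20)
            System.out.printf("%3d ms RTT -> %.0f ms%n", rtt, pageLoadMs(30, rtt, 1000, 5));
    }
}
```

With the assumed 30 round trips and a 1000 KB page, going from 5 to 10 Mbps saves 800 ms, while halving the RTT from 100 ms to 50 ms saves 1500 ms.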

Slide 5

Slide 5 text

HTTP/1 in the browser

Slide 6

Slide 6 text

HTTP / TCP impedance mismatch

Slide 7

Slide 7 text

HTTP/2 in the browser

Slide 8

Slide 8 text

HTTP/2 intent: not a new version of the protocol; it's about how it gets onto the wire

Slide 9

Slide 9 text

HTTP/2 brings network sympathy

Slide 10

Slide 10 text

Why HTTP/2 performs better

Slide 11

Slide 11 text

B1n4ry

Slide 12

Slide 12 text

COMPRESS headers [slide art: the word "headers" repeated many times, being compressed]
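Request headers are extremely repetitive from one request to the next. Note that HTTP/2's HPACK compresses headers with static/dynamic index tables, not with DEFLATE; this JDK-only sketch only demonstrates how much redundancy there is to exploit:

```java
import java.util.zip.Deflater;

// Header redundancy sketch. HPACK (HTTP/2's header compression) uses indexed
// header tables, NOT DEFLATE; this snippet just shows that typical request
// headers repeated across requests compress dramatically.
public class HeaderRedundancy {
    public static int compressedSize(byte[] input) {
        Deflater deflater = new Deflater();
        deflater.setInput(input);
        deflater.finish();
        byte[] out = new byte[input.length + 64];
        int total = 0;
        while (!deflater.finished()) total += deflater.deflate(out); // count output bytes
        return total;
    }

    public static void main(String[] args) {
        String header = "user-agent: Mozilla/5.0\naccept: text/html\ncookie: session=abc123\n";
        byte[] twentyRequests = header.repeat(20).getBytes(); // same headers, 20 requests
        System.out.println(twentyRequests.length + " bytes -> " + compressedSize(twentyRequests) + " bytes");
    }
}
```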

Slide 13

Slide 13 text

S-l-i-c-e

Slide 14

Slide 14 text

Priorities

Slide 15

Slide 15 text

PUSH

Slide 16

Slide 16 text

HTTP/2 on the server

Slide 17

Slide 17 text

HTTP/1 vs HTTP/2 benchmark [diagram: Client → Frontend → Backend; Frontend to Backend over HTTP/1 vs HTTP/2; client connection pool; 20 ms think time]

Slide 18

Slide 18 text

Benchmark
• Pace requests at a given rate
• Log the ratio of requests performed/planned
• Log response time percentiles
https://github.com/vietj/http2-bench

Slide 19

Slide 19 text

HTTP/1 - 8 connections - pipelined

Slide 20

Slide 20 text

What is limiting us?

Slide 21

Slide 21 text

FIFO: one thing at a time!
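With pipelining, responses must come back in request order, so one slow response holds up everything queued behind it (head-of-line blocking). An idealized timing sketch, with assumed service times rather than benchmark data:

```java
import java.util.Arrays;

// Head-of-line blocking sketch (idealized assumption: one FIFO connection,
// responses returned strictly in request order, made-up service times).
public class HeadOfLine {
    // Completion time of each pipelined response: each one must wait for
    // every earlier response to finish first.
    public static long[] fifoCompletion(long[] serviceMs) {
        long[] done = new long[serviceMs.length];
        long clock = 0;
        for (int i = 0; i < serviceMs.length; i++) {
            clock += serviceMs[i]; // queue behind all earlier responses
            done[i] = clock;
        }
        return done;
    }

    public static void main(String[] args) {
        long[] service = {500, 10, 10}; // one slow response up front
        // The two 10 ms responses finish after 510 ms and 520 ms.
        System.out.println(Arrays.toString(fifoCompletion(service)));
    }
}
```

With multiplexing, the two fast responses would complete in roughly their own service time instead of queuing behind the slow one.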

Slide 22

Slide 22 text

HTTP/2 multiplexing

Slide 23

Slide 23 text

HTTP/2 - 1 connection - concurrency 20

Slide 24

Slide 24 text

Concurrency increased!

Slide 25

Slide 25 text

But…

Slide 26

Slide 26 text

HTTP/2 - 1 connection - concurrency 400

Slide 27

Slide 27 text

Congestion

Slide 28

Slide 28 text

No content

Slide 29

Slide 29 text

Multithreading is an illusion of parallelism

Slide 30

Slide 30 text

Input / Output: streams of bytes (byte[])

Slide 31

Slide 31 text

However the reality is different

Slide 32

Slide 32 text

In reality we have CPUs with multiple cores, and we manipulate network packets

Slide 33

Slide 33 text

We want to make better use of our resources

Slide 34

Slide 34 text

How to write programs that are efficient?

Slide 35

Slide 35 text

The real problem is blocking

Slide 36

Slide 36 text

How to *not* block?
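The core move: instead of parking a thread waiting for a result, register a callback that runs when the result arrives and return immediately. A JDK-only sketch of the idea (Vert.x handlers, shown on the following slides, have the same shape with Vert.x's own API):

```java
import java.util.concurrent.CompletableFuture;

// Non-blocking style with plain JDK: register a callback instead of blocking.
// The calling thread is free the moment the callback is registered.
public class DontBlock {
    public static CompletableFuture<String> fetchBody() {
        return CompletableFuture.supplyAsync(() -> "Hello World"); // simulated async IO
    }

    public static void main(String[] args) {
        fetchBody().thenAccept(body ->              // called back when the body is ready...
            System.out.println(body.length()));     // ...no thread parked waiting for it
        System.out.println("not blocked");          // executes without waiting for the fetch
        fetchBody().join();                         // only here to keep the demo JVM alive
    }
}
```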

Slide 37

Slide 37 text

Concurrency with Vert.x

Slide 38

Slide 38 text

http://vertx.io
• Library for building reactive applications
• Eclipse project (ASL2/EPL)
• Java 8
Vert.x Core
• core library for building a stack
• embeddable (core 1MB)
Vert.x stack
• coherent set of libraries built on top of Vert.x Core
• data access, messaging, metrics, etc.

Slide 39

Slide 39 text

What's Vert.x?
• Inspired by Erlang/OTP and Node
• Event driven
• Polyglot
• Event bus
• High performance / easy to scale
• Lightweight / embeddable
• Clustering / HA

Slide 40

Slide 40 text

Why Vert.x?
• Simple concurrency model
• Unified async API for IO, filesystem, data access, messaging, …
• Easy to scale
• Easy to deploy
• Coherent stack
• Also provides an RxJava API

Slide 41

Slide 41 text

Non blocking server

  public static void main(String[] args) {
    Vertx vertx = Vertx.vertx();
    HttpServer server = vertx.createHttpServer();
    server.requestHandler(req -> {
      req.response()
        .putHeader("Content-Type", "text/plain")
        .end("Hello World");
    });
    server.listen(8080);
  }

Slide 42

Slide 42 text

Non blocking client

  public static void main(String[] args) {
    Vertx vertx = Vertx.vertx();
    HttpClient client = vertx.createHttpClient();
    client.getNow("http://backend", resp -> {
      int status = resp.statusCode();
      resp.bodyHandler(body -> {
        System.out.println(body.length());
      });
    });
  }

Slide 43

Slide 43 text

NB server+client

  server.requestHandler(req -> {
    HttpServerResponse resp = req.response();
    client.getNow("http://backend", clientResp -> {
      int code = clientResp.statusCode();
      resp.setStatusCode(code);
      clientResp.bodyHandler(body -> {
        resp.end(body);
      });
    });
  });

Slide 44

Slide 44 text

client.getNow(url, …)
resp.setStatusCode(code)
resp.end(body)

Slide 45

Slide 45 text

No content

Slide 46

Slide 46 text

Reactor pattern: single-threaded event loop
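The reactor pattern boils down to one thread draining a queue of events, so handlers never run concurrently with each other and need no locks. A deliberately minimal sketch (an assumption for illustration: just a task queue, no selectors or actual IO, unlike a real event loop such as Vert.x's):

```java
import java.util.Set;
import java.util.concurrent.*;

// Minimal reactor sketch (simplified: a plain task queue, no selectors/IO).
// One thread drains the queue, so every handler runs on the same thread
// and handlers never execute concurrently with one another.
public class EventLoop {
    private final BlockingQueue<Runnable> tasks = new LinkedBlockingQueue<>();
    private final Thread thread = new Thread(() -> {
        try {
            while (true) tasks.take().run(); // events handled one at a time
        } catch (InterruptedException e) {
            // loop shut down
        }
    });

    public EventLoop() {
        thread.setDaemon(true);
        thread.start();
    }

    // Queue a handler; never blocks the caller.
    public void runOnContext(Runnable task) {
        tasks.offer(task);
    }

    public static void main(String[] args) throws Exception {
        EventLoop loop = new EventLoop();
        Set<String> threadNames = ConcurrentHashMap.newKeySet();
        CountDownLatch latch = new CountDownLatch(100);
        for (int i = 0; i < 100; i++) {
            loop.runOnContext(() -> {
                threadNames.add(Thread.currentThread().getName());
                latch.countDown();
            });
        }
        latch.await();
        System.out.println(threadNames.size()); // 1: every handler ran on the same thread
    }
}
```

The corollary of the single thread is the golden rule: a handler must never block, or every queued event behind it stalls.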

Slide 47

Slide 47 text

C10K: it's all about concurrency

Slide 48

Slide 48 text

HTTP/2 non blocking - 1 connection - concurrency 400

Slide 49

Slide 49 text

using… a single core!

Slide 50

Slide 50 text

Multi reactor pattern
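To use all cores while keeping each loop single-threaded, the multi-reactor pattern runs one event loop per core and pins each connection to one loop. A simplified sketch of the idea (an illustrative assumption using JDK executors, not Vert.x's actual implementation):

```java
import java.util.concurrent.*;

// Multi-reactor sketch (simplified, JDK executors standing in for event
// loops): one single-threaded loop per core, each new connection pinned to
// one loop via round-robin assignment. Each loop stays single-threaded
// (no locks in handlers) while all cores are kept busy.
public class MultiReactor {
    private final ExecutorService[] loops;
    private int next = 0;

    public MultiReactor(int cores) {
        loops = new ExecutorService[cores];
        for (int i = 0; i < cores; i++)
            loops[i] = Executors.newSingleThreadExecutor();
    }

    // Pin a new connection to a loop; all its events will run there.
    public ExecutorService assign() {
        return loops[next++ % loops.length];
    }

    public void shutdown() {
        for (ExecutorService loop : loops) loop.shutdown();
    }

    public static void main(String[] args) {
        MultiReactor reactor = new MultiReactor(Runtime.getRuntime().availableProcessors());
        ExecutorService connectionLoop = reactor.assign();
        connectionLoop.execute(() ->
            System.out.println("connection event on " + Thread.currentThread().getName()));
        reactor.shutdown();
    }
}
```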

Slide 51

Slide 51 text

Load balancing

Slide 52

Slide 52 text

HTTP/2 non blocking - 4 connections - concurrency 200

Slide 53

Slide 53 text

Discussion
• From network to application
• Reactive back pressure
• Going distributed