Slide 1

Fetch Web Streams @kidwm JSDC 2017

Slide 2

Streams

Slide 3

Without streaming: Fetch, then Process, then Render (each step waits for the previous one).
With streaming: Fetch, Process, and Render overlap.
https://jakearchibald.com/2016/streams-ftw/

Slide 4

Node.js Streams
"A stream is an abstract interface for working with streaming data in Node.js. Streams can be readable, writable, or both. All streams are instances of EventEmitter."
• Buffer: handles pieces of data
• I/O data: consumes less memory, saves more resources
• Gulp uses streams

Slide 5

Types of Node.js Streams: Source => Buffer => Sink
• Readable: fs.createReadStream()
• Writable: fs.createWriteStream()
• Duplex: net.Socket
• Transform: zlib.createDeflate()
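The Transform type above can be sketched in a few lines. A minimal example, assuming Node.js with its built-in stream module (the uppercasing logic is just an illustration, not from the talk):

const { Transform } = require('stream');

// A Transform stream sits between a source and a sink:
// data written to it comes out the readable side, transformed.
const upper = new Transform({
  transform(chunk, encoding, callback) {
    // pass the transformed chunk downstream
    callback(null, chunk.toString().toUpperCase());
  }
});

upper.write('hello streams');
const out = upper.read(); // Buffer containing 'HELLO STREAMS'

With a synchronous transform callback, the written chunk is available to read() immediately, which is what makes the example testable without events.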

Slide 6

Express response stream

response.writeHead(200, {
  'content-type': 'text/html',
  'transfer-encoding': 'chunked'
});
response.write('foo\n'); // multiple times

Slide 7

React 16 Server Side Rendering: ReactDOMServer.renderToNodeStream

app.get('/', (req, res) => {
  // element: the root React element to render
  ReactDOMServer.renderToNodeStream(element).pipe(res);
});

Slide 8

Web Streams

Slide 9

Web Streams
"This specification provides APIs for creating, composing, and consuming streams of data that map efficiently to low-level I/O primitives."
Only two types of streams: readable and writable streams.
https://streams.spec.whatwg.org/

Slide 10

ReadableStream

const readableStream = new ReadableStream({
  start(controller) {},
  pull(controller) {}, // called when stream's buffer isn't full
  cancel(reason) {}
}, queuingStrategy);
console.log(readableStream.constructor.prototype);

Slide 11

ReadableStreamDefaultController
• controller.enqueue(chunk) // queue data in buffer
• controller.close()
• controller.error(e)
• controller.desiredSize // the amount of buffer remaining
https://jsbin.com/fahavoz/edit?js,console
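desiredSize can be observed directly. A small sketch, assuming a runtime with global Web Streams (modern browsers, Node 18+); keeping a handle to the controller outside the constructor is only for illustration:

let ctrl;
const rs = new ReadableStream({
  start(controller) { ctrl = controller; } // grab the controller to poke at it
}, new CountQueuingStrategy({ highWaterMark: 2 }));

const before = ctrl.desiredSize; // 2: the whole buffer is free
ctrl.enqueue('a');
const after = ctrl.desiredSize;  // 1: one queued chunk consumed one slot
ctrl.close();

desiredSize is simply highWaterMark minus the total size of queued chunks, so each enqueue() of an unread chunk decreases it by one under CountQueuingStrategy.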

Slide 12

push source

const readableStream = new ReadableStream({
  start(controller) {
    controller.enqueue(1);
    controller.enqueue(2);
    controller.close();
  }
});

Slide 13

pull source

const readableStream = new ReadableStream({
  pull(controller) {
    controller.enqueue(1);
    controller.enqueue(2);
    controller.close();
  }
});

Slide 14

getReader(): ReadableStreamDefaultReader

const reader = readableStream.getReader();
reader.closed.then(() => { console.log('reader closed'); });
console.log(await reader.read()); // 1
console.log(await reader.read()); // 2
console.log(await reader.read()); // done
• reader.cancel()
• reader.releaseLock()

Slide 15

locked / releaseLock()

const readableStream = new ReadableStream();
console.log(readableStream.locked); // false
const reader = readableStream.getReader();
console.log(readableStream.locked); // true
reader.releaseLock();
console.log(readableStream.locked); // false

Slide 16

tee: to fork a stream is to obtain two new readable streams

const teed = stream.tee();
console.log(teed[0]);
console.log(teed[1]);
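A self-contained sketch of forking, assuming global Web Streams support (modern browsers, Node 18+):

const rs = new ReadableStream({
  start(controller) {
    controller.enqueue(1);
    controller.close();
  }
});

// tee() returns two branches; each sees the same chunks independently.
const [branchA, branchB] = rs.tee();
// the original stream is now locked: only the branches can be read

Note that tee() acquires a reader on the original stream internally, so rs.locked flips to true immediately.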

Slide 17

WritableStream

const writableStream = new WritableStream({
  start(controller) {}, // controller.error()
  write(chunk, controller) {},
  abort(reason) {},
  close(controller) {}
}, queuingStrategy);

Slide 18

getWriter(): WritableStreamDefaultWriter

const defaultWriter = writableStream.getWriter();
• defaultWriter.abort()
• defaultWriter.close()
• defaultWriter.releaseLock()
• defaultWriter.write()
• defaultWriter.desiredSize
• defaultWriter.ready.then()
• defaultWriter.closed.then()

Slide 19

queuingStrategy
• CountQueuingStrategy
• ByteLengthQueuingStrategy

new CountQueuingStrategy({ highWaterMark: 1 })

The default is a CountQueuingStrategy with a high water mark of 1.

Slide 20

desiredSize and await

const strategy = new CountQueuingStrategy({highWaterMark: 2});
const writer = (new WritableStream({}, strategy)).getWriter();
console.log(writer.desiredSize); // 2
writer.write('1');
console.log(writer.desiredSize); // 1
writer.write('1');
writer.write('1'); // desiredSize = 0
await writer.write('1'); // notice the await keyword
console.log(writer.desiredSize); // 2

Slide 21

pipe: chaining an async sequence
• stream.pipeTo(writable)
• stream.pipeThrough(transform)

await src.pipeThrough(through).pipeTo(dest);
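The transform in the middle of that chain is a TransformStream: an object exposing a writable end and a readable end. A minimal sketch, assuming global Web Streams support (modern browsers, Node 18+); the uppercasing transform is an illustration only:

// pipeThrough(upcase) writes into upcase.writable and
// continues the chain from upcase.readable.
const upcase = new TransformStream({
  transform(chunk, controller) {
    controller.enqueue(String(chunk).toUpperCase());
  }
});

// usage shape: readable.pipeThrough(upcase).pipeTo(writable)

pipeThrough() works with any { writable, readable } pair; TransformStream is simply the standard way to build one.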

Slide 22

backpressure: flow control
• desiredSize
  • can be negative, if the queue is over-full
• writer.ready
  • returns a Promise that resolves when the desired size of the stream's internal queue transitions from non-positive to positive, signaling that it is no longer applying backpressure
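The negative desiredSize case is easy to provoke with a sink that never finishes a write. A sketch, assuming global Web Streams support (modern browsers, Node 18+); the never-resolving promise is purely an illustration of a slow sink:

const ws = new WritableStream({
  write(chunk) {
    return new Promise(() => {}); // sink that never completes a write
  }
}, new CountQueuingStrategy({ highWaterMark: 1 }));

const writer = ws.getWriter();
const initial = writer.desiredSize; // 1: queue is empty
writer.write('a');
writer.write('b'); // second chunk overfills the queue
const overfull = writer.desiredSize; // negative: backpressure is on

A well-behaved producer would await writer.ready (or the promise returned by write()) instead of queuing blindly like this.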

Slide 23

Fetch Web Streams

Slide 24

Response

fetch(url).then(response => {
  // response.body is a stream
  const reader = response.body.getReader();
  return reader.read().then(function processResult(result) {
    console.log(result.value); // chunk content, a Uint8Array
    if (result.done) return console.log("Fetch complete");
    return reader.read().then(processResult); // read more
  });
});
http://jsbin.com/vuqasa/edit?js,console

Slide 25

Cancelling: a stream can be cancelled using stream.cancel()
• response.body.cancel()
• reader.cancel()
https://jsbin.com/gameboy/edit?js,console

Slide 26

Service Worker

self.addEventListener('fetch', event => {
  event.respondWith(new Response(stream, {
    headers: {'Content-Type': 'text/html'}
  }));
});
https://glitch.com/edit/#!/html-sw-stream?path=public/sw.js:1:0
https://jakearchibald.github.io/isserviceworkerready/demos/simple-stream/

Slide 27

Async iterators

async function getResponseSize(url) {
  const response = await fetch(url);
  let total = 0;
  for await (const chunk of response.body) {
    total += chunk.length;
  }
  return total;
}

Slide 28

Downloading progress

const response = await fetch(url);
const reader = response.body.getReader();
const contentLength = response.headers.get('Content-Length');
const {value: {length} = {}, done} = await reader.read();
// progress: chunk.value.length / parseInt(contentLength, 10)

Slide 29

Other possible applications
1. Gzip/deflate
2. Audio/video/image codecs
3. CSV-to-JSON converter
4. The streaming HTML/XML parser

Slide 30

Browser Support
• ReadableStream / WritableStream
  • Firefox 57+ (not enabled yet)
• Service Worker respondWith stream
  • Firefox 58+
• Async iterators
  • Chrome 63+ / Firefox 57+

Slide 31

Recap
• Streams handle sources of data in a memory-efficient way
• Web Streams are available with the Fetch API
• Web Streams can be handled in a Service Worker
• Better performance for Progressive Web Apps

Slide 32

Thanks! @kidwm
Ref:
• 2016 - the year of web streams by Jake Archibald
• A Guide to Faster Web App I/O and Data Operations with Streams by Umar Hansa