Fetch Web Streams

WM
November 04, 2017

JSDC 2017
Transcript

  1. Fetch Web Streams @kidwm JSDC 2017

  2. Streams

  3. Without streaming: Fetch → Process → Render run one after another. With streaming: Fetch, Process, and Render overlap. https://jakearchibald.com/2016/streams-ftw/

  4. Node.js Streams "A stream is an abstract interface for working with streaming data in Node.js. Streams can be readable, writable, or both. All streams are instances of EventEmitter." • Buffer: handles pieces of data • I/O data: consume less memory, save more resources • Gulp uses streams
  5. Types of Node.js Streams Source => Buffer => Sink • Readable • fs.createReadStream() • Writable • fs.createWriteStream() • Duplex • net.Socket • Transform • zlib.createDeflate()
  6. Express Stream response response.writeHead(200, { 'content-type': 'text/html', 'transfer-encoding': 'chunked' }); response.write('<p>foo</p>\n'); // multiple times
  7. React 16 Server Side Rendering app.get('/', (req, res) => { ReactDOMServer.renderToNodeStream( <Html initialData={JSON.stringify(initialData)}> <App /> </Html> ).pipe(res); });
  8. Web Streams

  9. Web Streams "This specification provides APIs for creating, composing, and consuming streams of data that map efficiently to low-level I/O primitives." Only two types of streams: readable and writable streams https://streams.spec.whatwg.org/
  10. ReadableStream const readableStream = new ReadableStream({ start(controller) {}, pull(controller) {}, // called when stream's buffer isn't full cancel(reason) {} }, queuingStrategy); console.log(readableStream.constructor.prototype);
  11. ReadableStreamDefaultController • controller.enqueue(chunk) // queue data in buffer • controller.close() • controller.error(e) • controller.desiredSize // the amount of buffer remaining https://jsbin.com/fahavoz/edit?js,console
  12. push source const readableStream = new ReadableStream({ start(controller) { controller.enqueue(1); controller.enqueue(2); controller.close(); } });
  13. pull source const readableStream = new ReadableStream({ pull(controller) { controller.enqueue(1); controller.enqueue(2); controller.close(); } });
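One way to see the push/pull difference (a sketch, not from the deck): with a highWaterMark of 0, the stream buffers nothing ahead of time, so pull() runs only when a reader actually asks for data.

```javascript
// Sketch: a pull source queried on demand. With highWaterMark: 0 nothing
// is prefetched, so each read() triggers one pull().
let pulls = 0;
const lazyStream = new ReadableStream({
  pull(controller) {
    pulls += 1;
    controller.enqueue(pulls);
    if (pulls === 2) controller.close();
  }
}, new CountQueuingStrategy({ highWaterMark: 0 }));

const reader = lazyStream.getReader();
const results = (async () => {
  const first = await reader.read();  // triggers the first pull
  const second = await reader.read(); // triggers the second pull
  return [first.value, second.value];
})();

results.then(values => console.log(values)); // [ 1, 2 ]
```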
  14. ReadableStreamDefaultReader const reader = readableStream.getReader(); reader.closed.then(() => {console.log('reader closed');}); console.log(await reader.read()); // 1 console.log(await reader.read()); // 2 console.log(await reader.read()); // done • reader.cancel() • reader.releaseLock()
  15. locked const readableStream = new ReadableStream(); console.log(readableStream.locked); // false const reader = readableStream.getReader(); console.log(readableStream.locked); // true reader.releaseLock(); console.log(readableStream.locked); // false
  16. tee To fork a stream is to obtain two new readable streams const teed = stream.tee(); console.log(teed[0]); console.log(teed[1]);
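A short sketch of tee() in action (not from the slides): each branch independently sees the full sequence of chunks.

```javascript
// Sketch: tee() yields two branches; each can be read at its own pace and
// both observe every chunk.
const forked = new ReadableStream({
  start(controller) {
    controller.enqueue('a');
    controller.enqueue('b');
    controller.close();
  }
});

const [branch1, branch2] = forked.tee();

async function readAll(readable) {
  const reader = readable.getReader();
  const chunks = [];
  for (;;) {
    const { value, done } = await reader.read();
    if (done) return chunks;
    chunks.push(value);
  }
}

const both = Promise.all([readAll(branch1), readAll(branch2)]);
both.then(([a, b]) => console.log(a, b)); // [ 'a', 'b' ] [ 'a', 'b' ]
```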
  17. WritableStream const writableStream = new WritableStream({ start(controller) {}, // controller.error() write(chunk, controller) {}, abort(reason) {}, close(controller) {} }, queuingStrategy);
  18. WritableStreamDefaultWriter const defaultWriter = writableStream.getWriter(); • defaultWriter.abort() • defaultWriter.close() • defaultWriter.releaseLock() • defaultWriter.write() • defaultWriter.desiredSize • defaultWriter.ready.then() • defaultWriter.closed.then()
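A sketch (not from the slides) of driving a WritableStream through its default writer: write() returns a promise, and close() flushes pending chunks before resolving.

```javascript
// Sketch: collecting chunks through a WritableStream via getWriter().
const received = [];
const ws = new WritableStream({
  write(chunk) { received.push(chunk); }
});

const defaultWriter = ws.getWriter();
const finished = defaultWriter.write('a')
  .then(() => defaultWriter.write('b'))
  .then(() => defaultWriter.close());

finished.then(() => console.log(received)); // [ 'a', 'b' ]
```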
  19. queuingStrategy • CountQueuingStrategy • ByteLengthQueuingStrategy new CountQueuingStrategy({ highWaterMark: 1 }) The default is a CountQueuingStrategy with a high water mark of 1
  20. desiredSize const strategy = new CountQueuingStrategy({highWaterMark: 2}); const writer = (new WritableStream({}, strategy)).getWriter(); console.log(writer.desiredSize); // 2 writer.write('1'); console.log(writer.desiredSize); // 1 writer.write('1'); writer.write('1'); // desiredSize = 0 await writer.write('1'); // Notice the await keyword console.log(writer.desiredSize); // 2
  21. pipe Chaining async sequence • stream.pipeTo(writable) • stream.pipeThrough(transform) await src.pipeThrough(through).pipeTo(dest);
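The chain above can be sketched end to end (names are illustrative, not from the deck) with a TransformStream between a readable source and a writable destination:

```javascript
// Sketch: pipeThrough() threads chunks through a TransformStream, then
// pipeTo() drains them into a WritableStream; the result is a promise
// that settles when the whole pipe finishes.
const src = new ReadableStream({
  start(controller) {
    controller.enqueue(1);
    controller.enqueue(2);
    controller.close();
  }
});

const double = new TransformStream({
  transform(chunk, controller) { controller.enqueue(chunk * 2); }
});

const out = [];
const dest = new WritableStream({
  write(chunk) { out.push(chunk); }
});

const done = src.pipeThrough(double).pipeTo(dest);
done.then(() => console.log(out)); // [ 2, 4 ]
```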
  22. backpressure Flow control • desiredSize • can be negative, if the queue is over-full • writer.ready • Returns a Promise that resolves when the desired size of the stream's internal queue transitions from non-positive to positive, signaling that it is no longer applying backpressure.
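A sketch (assuming a deliberately slow sink, not from the slides) that makes backpressure observable through desiredSize and writer.ready:

```javascript
// Sketch: with highWaterMark: 1 and a slow sink, one in-flight write fills
// the queue and desiredSize drops to 0 until the sink catches up.
const slow = new WritableStream({
  write(chunk) {
    return new Promise(resolve => setTimeout(resolve, 10)); // slow sink
  }
}, new CountQueuingStrategy({ highWaterMark: 1 }));

const writer = slow.getWriter();
console.log(writer.desiredSize); // 1: room in the queue
writer.write('a');
console.log(writer.desiredSize); // 0: backpressure is being applied

// writer.ready resolves once desiredSize is positive again.
const relieved = writer.ready.then(() => writer.desiredSize); // 1
```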
  23. Fetch Web Streams

  24. Response fetch(url).then(response => { // response.body is a stream const reader = response.body.getReader(); return reader.read().then(function processResult(result) { console.log(result.value); // chunk content, a Uint8Array if (result.done) return console.log("Fetch complete"); return reader.read().then(processResult); // read more }); }); http://jsbin.com/vuqasa/edit?js,console
  25. Cancelling A stream can be cancelled using stream.cancel() • response.body.cancel() • reader.cancel() https://jsbin.com/gameboy/edit?js,console
  26. Service Worker self.addEventListener('fetch', event => { event.respondWith(new Response(stream, { headers: {'Content-Type': 'text/html'} }));}) https://glitch.com/edit/#!/html-sw-stream?path=public/sw.js:1:0 https://jakearchibald.github.io/isserviceworkerready/demos/simple-stream/
  27. Async iterators async function getResponseSize(url) { const response = await fetch(url); let total = 0; for await (const chunk of response.body) { total += chunk.length; } return total;}
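The same loop can be exercised without a network request (a sketch: a locally built stream of Uint8Array chunks stands in for response.body, on runtimes that support async iteration of ReadableStream, e.g. Node.js):

```javascript
// Sketch: for await ... of over a ReadableStream of Uint8Array chunks,
// summing chunk lengths exactly as getResponseSize does.
const body = new ReadableStream({
  start(controller) {
    controller.enqueue(new Uint8Array(3));
    controller.enqueue(new Uint8Array(4));
    controller.close();
  }
});

const total = (async () => {
  let size = 0;
  for await (const chunk of body) {
    size += chunk.length;
  }
  return size;
})();

total.then(size => console.log(size)); // 7
```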
  28. Downloading progress const response = await fetch(url); const reader = response.body.getReader(); const contentLength = response.headers.get('Content-Length'); const {value: {length} = {}, done} = await reader.read(); // progress: chunk.value.length / parseInt(contentLength, 10)
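The progress calculation can be sketched against a locally built stream (fakeBody stands in for response.body, and contentLength for response.headers.get('Content-Length'); names are illustrative):

```javascript
// Sketch: accumulate received bytes per chunk and divide by the declared
// Content-Length to get a progress ratio.
const fakeBody = new ReadableStream({
  start(controller) {
    controller.enqueue(new Uint8Array(30));
    controller.enqueue(new Uint8Array(70));
    controller.close();
  }
});
const contentLength = '100';

async function trackProgress(stream, total) {
  const reader = stream.getReader();
  let received = 0;
  const updates = [];
  for (;;) {
    const { value, done } = await reader.read();
    if (done) return updates;
    received += value.length;
    updates.push(received / parseInt(total, 10));
  }
}

const progress = trackProgress(fakeBody, contentLength);
progress.then(updates => console.log(updates)); // [ 0.3, 1 ]
```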
  29. Other possible applications 1. Gzip/deflate 2. Audio/video/image codecs 3. CSV-to-JSON converter 4. The streaming HTML/XML parser
  30. Browser Support • ReadableStream / WritableStream • Firefox 57+ (not enabled yet) • Service Worker respondWith stream • Firefox 58+ • Async iterators: • Chrome 63+ / Firefox 57+
  31. Recap • Streams handle sources of data in a memory-efficient way • Web Streams are available with the Fetch API • Handle Web Streams in a Service Worker • Better performance for Progressive Web Apps
  32. Thanks! @kidwm Ref: • 2016 - the year of web streams by Jake Archibald • A Guide to Faster Web App I/O and Data Operations with Streams by Umar Hansa