
It’s about time to embrace Streams

In my experience with JavaScript/Node.js developers, I have found that streams are a topic still mostly unexplored and sometimes even feared. Streams are probably one of the most beautiful features of Node.js, and they can be adopted in the browser as well. Once you grasp the fundamentals, you'll be able to solve some ordinary programming challenges in a much more elegant and efficient way. In this talk, I will cover the following topics: streams: when and how; different types of streams; built-in and custom streams; composability; stream utils; streams in the browser.

Luciano Mammino

May 03, 2019

Transcript

  1. Luciano Mammino (@loige) IT'S ABOUT TIME TO EMBRACE STREAMS
     loige.link/streams-cityjs
     London - May 3, 2019
  2. // buffer-copy.js
     const { readFileSync, writeFileSync } = require('fs')
     const [,, src, dest] = process.argv
     // read entire file content
     const content = readFileSync(src)
     // write that content somewhere else
     writeFileSync(dest, content)
  3. WE DO THIS ALL THE TIME AND IT'S OK
     BUT SOMETIMES ...
  4. WHAT CAN WE DO IF WE HAVE TO MOVE MANY BLOCKS?
  5. WE CAN MOVE THEM ONE BY ONE!
     We stream them...
  6. HELLO, I AM LUCIANO!
     Cloud Architect
     Blog: loige.co - Twitter: @loige - GitHub: @lmammino
  7. BUFFER: DATA STRUCTURE TO STORE AND TRANSFER ARBITRARY BINARY DATA
     *Note that this is loading all the content of the file in memory*
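A minimal sketch of working with a Buffer directly (my addition, not from the deck), just to make the definition concrete:

    // a Buffer is a fixed-size chunk of raw bytes
    const buf = Buffer.from('hello streams', 'utf8')
    console.log(buf.length)           // byte length (13)
    console.log(buf.toString('utf8')) // decode back to a string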
  8. STREAM: ABSTRACT INTERFACE FOR WORKING WITH STREAMING DATA
     *It does not load all the data straight away*
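To see that data arrives in chunks rather than all at once, a hedged sketch of mine (it reuses the demo file path that appears later in the deck):

    const { createReadStream } = require('fs')
    // each 'data' event carries only one chunk (64 KB by default for file streams)
    createReadStream('assets/poster.psd')
      .on('data', chunk => console.log(`received ${chunk.length} bytes`))
      .on('end', () => console.log('done'))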
  9. FILE COPY: THE BUFFER WAY
     // buffer-copy.js
     const { readFileSync, writeFileSync } = require('fs')
     const [,, src, dest] = process.argv
     const content = readFileSync(src)
     writeFileSync(dest, content)
  10. FILE COPY: THE STREAM WAY
      // stream-copy.js
      const { createReadStream, createWriteStream } = require('fs')
      const [,, src, dest] = process.argv
      const srcStream = createReadStream(src)
      const destStream = createWriteStream(dest)
      srcStream.on('data', (data) => destStream.write(data))
      * Careful: this implementation is not optimal *
  11. MEMORY COMPARISON (~600MB FILE)
      node --inspect-brk buffer-copy.js assets/poster.psd ~/Downloads/poster.psd
  12. MEMORY COMPARISON (~600MB FILE)
      node --inspect-brk stream-copy.js assets/poster.psd ~/Downloads/poster.psd
  13. LET'S TRY WITH A BIG FILE (~10GB)
      node --inspect-brk stream-copy.js assets/the-matrix-hd.mkv ~/Downloads/the-matrix-hd.mkv
  14. STREAMS VS BUFFERS
      Streams keep a low memory footprint even with large amounts of data.
      Streams allow you to process data as soon as it arrives.
  15. ALL STREAMS ARE EVENT EMITTERS
      A stream instance is an object that emits events when its internal state changes, for instance:
      s.on('readable', () => {})   // ready to be consumed
      s.on('data', (chunk) => {})  // new data is available
      s.on('error', (err) => {})   // some error happened
      s.on('end', () => {})        // no more data available
      The events available depend on the type of stream.
  16. READABLE STREAMS
      A readable stream represents a source from which data is consumed.
      Examples:
        fs readStream
        process.stdin
        HTTP response (client-side)
        HTTP request (server-side)
        AWS S3 GetObject (data field)
      It supports two modes for data consumption: flowing and paused (or non-flowing) mode.
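The next slides illustrate flowing mode; for completeness, here is a minimal sketch of paused (non-flowing) mode, my addition, assuming a file path is passed as an argument like in the other examples:

    const { createReadStream } = require('fs')
    const file = createReadStream(process.argv[2])
    file.on('readable', () => {
      let chunk
      // in paused mode you pull data explicitly; read() returns null
      // once the internal buffer is empty
      while ((chunk = file.read()) !== null) {
        console.log(`read ${chunk.length} bytes`)
      }
    })
    file.on('end', () => console.log('no more data'))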
  17. READABLE STREAMS (FLOWING MODE)
      Data is read from the source automatically and chunks are emitted as soon as they are available. When no more data is available, end is emitted.
      [Animated diagram: source data chunks 1, 2, 3 are read one at a time by the readable stream and delivered to the attached data listener, finishing with the end event.]
  18. // count-emojis-flowing.js
      const { createReadStream } = require('fs')
      const { EMOJI_MAP } = require('emoji') // from npm
      const emojis = Object.keys(EMOJI_MAP)
      const file = createReadStream(process.argv[2])
      let counter = 0
      file.on('data', chunk => {
        for (let char of chunk.toString('utf8')) {
          if (emojis.includes(char)) {
            counter++
          }
        }
      })
      file.on('end', () => console.log(`Found ${counter} emojis`))
      file.on('error', err => console.error(`Error reading file: ${err}`))
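As an aside (not in the deck): readable streams are also async iterable in recent Node.js versions (experimental in Node 10, stabilized later), so the same count can be sketched with for await:

    const { createReadStream } = require('fs')
    const { EMOJI_MAP } = require('emoji') // from npm
    const emojis = Object.keys(EMOJI_MAP)

    async function countEmojis (path) {
      let counter = 0
      // consume the stream chunk by chunk; backpressure is handled for us
      for await (const chunk of createReadStream(path)) {
        for (const char of chunk.toString('utf8')) {
          if (emojis.includes(char)) counter++
        }
      }
      return counter
    }

    countEmojis(process.argv[2])
      .then(n => console.log(`Found ${n} emojis`))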
  19. WRITABLE STREAMS
      A writable stream is an abstraction that allows you to write data to a destination.
      Examples:
        fs writeStream
        process.stdout, process.stderr
        HTTP request (client-side)
        HTTP response (server-side)
        AWS S3 PutObject (body parameter)
  20. // writable-http-request.js
      const http = require('http')
      const req = http.request(
        { hostname: 'enx6b07hdu6cs.x.pipedream.net', method: 'POST' },
        resp => {
          console.log(`Server responded with "${resp.statusCode}"`)
        }
      )
      req.on('finish', () => console.log('request sent'))
      req.on('close', () => console.log('Connection closed'))
      req.on('error', err => console.error(`Request failed: ${err}`))
      req.write('writing some content...\n')
      req.end('last write & close the stream')
  21. BACKPRESSURE
      When writing large amounts of data you should make sure you handle the stop-write signal and the drain event.
      loige.link/backpressure
  22. // stream-copy-safe.js
      const { createReadStream, createWriteStream } = require('fs')
      const [,, src, dest] = process.argv
      const srcStream = createReadStream(src)
      const destStream = createWriteStream(dest)
      srcStream.on('data', data => {
        const canContinue = destStream.write(data)
        if (!canContinue) {
          // we are overflowing the destination, we should pause
          srcStream.pause()
          // we will resume when the destination stream is drained
          destStream.once('drain', () => srcStream.resume())
        }
      })
  23. OTHER TYPES OF STREAM
      Duplex Stream: streams that are both Readable and Writable (net.Socket)
      Transform Stream: Duplex streams that can modify or transform the data as it is written and read (zlib.createGzip(), crypto.createCipheriv()); see the sketch below.
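The deck builds custom streams as classes later on; note that Node also accepts an inline simplified constructor, shown here as a hedged sketch of my own (not from the deck):

    const { Transform } = require('stream')
    // a one-off transform that upper-cases whatever flows through it
    const uppercase = new Transform({
      transform (chunk, encoding, done) {
        this.push(chunk.toString().toUpperCase())
        done()
      }
    })
    process.stdin.pipe(uppercase).pipe(process.stdout)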
  24. ANATOMY OF A TRANSFORM STREAM
      1. write data (into its writable side)
      2. the stream transforms the data
      3. read the transformed data (from its readable side)
  25. GZIP EXAMPLE
      Same anatomy with zlib.createGzip(): write uncompressed data in, the stream compresses it, read compressed data out.
  26. // stream-copy-gzip.js
      const { createReadStream, createWriteStream } = require('fs')
      const { createGzip } = require('zlib')
      const [,, src, dest] = process.argv
      const srcStream = createReadStream(src)
      const gzipStream = createGzip()
      const destStream = createWriteStream(dest)
      srcStream.on('data', data => {
        const canContinue = gzipStream.write(data)
        if (!canContinue) {
          srcStream.pause()
          gzipStream.once('drain', () => {
            srcStream.resume()
          })
        }
      })
      srcStream.on('end', () => {
        // check if there's buffered data left
        const remainingData = gzipStream.read()
        if (remainingData !== null) {
          destStream.write(remainingData)
        }
        gzipStream.end()
      })
      gzipStream.on('data', data => {
        const canContinue = destStream.write(data)
        if (!canContinue) {
          gzipStream.pause()
          destStream.once('drain', () => {
            gzipStream.resume()
          })
        }
      })
      gzipStream.on('end', () => {
        destStream.end()
      })
      // ⚠ TODO: handle errors!
  27. readable.pipe(writableDest)
      Connects a readable stream to a writable stream. A transform stream can be used as a destination as well. It returns the destination stream, allowing for a chain of pipes (and it handles backpressure for you):
      readable
        .pipe(transform1)
        .pipe(transform2)
        .pipe(transform3)
        .pipe(writable)
  28. // stream-copy-gzip-pipe.js
      const { createReadStream, createWriteStream } = require('fs')
      const { createGzip } = require('zlib')
      const [,, src, dest] = process.argv
      const srcStream = createReadStream(src)
      const gzipStream = createGzip()
      const destStream = createWriteStream(dest)
      srcStream
        .pipe(gzipStream)
        .pipe(destStream)
  29. HANDLING ERRORS (CORRECTLY)
      readable
        .on('error', handleErr)
        .pipe(decompress)
        .on('error', handleErr)
        .pipe(decrypt)
        .on('error', handleErr)
        .pipe(convert)
        .on('error', handleErr)
        .pipe(encrypt)
        .on('error', handleErr)
        .pipe(compress)
        .on('error', handleErr)
        .pipe(writeToDisk)
        .on('error', handleErr)
      handleErr should end and destroy the streams (it doesn't happen automatically).
  30. stream.pipeline(...streams, callback)
      // stream-copy-gzip-pipeline.js
      const { pipeline } = require('stream')
      const { createReadStream, createWriteStream } = require('fs')
      const { createGzip } = require('zlib')
      const [,, src, dest] = process.argv
      pipeline(
        createReadStream(src),
        createGzip(),
        createWriteStream(dest),
        function onEnd (err) {
          if (err) {
            console.error(`Error: ${err}`)
            process.exit(1)
          }
          console.log('Done!')
        }
      )
      You can pass multiple streams (they will be piped). The last argument is a callback: if invoked with an error, it means the pipeline failed at some point. All the streams are ended and destroyed correctly.
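Not shown in the deck: if you prefer promises and async/await, pipeline can be promisified with util.promisify (a sketch assuming Node 10+, where stream.pipeline exists):

    const { pipeline } = require('stream')
    const { promisify } = require('util')
    const { createReadStream, createWriteStream } = require('fs')
    const { createGzip } = require('zlib')

    const pipelineAsync = promisify(pipeline)

    async function copyGzip (src, dest) {
      // resolves when all streams have finished, rejects on the first error
      await pipelineAsync(
        createReadStream(src),
        createGzip(),
        createWriteStream(dest)
      )
    }

    copyGzip(process.argv[2], process.argv[3])
      .then(() => console.log('Done!'))
      .catch(err => { console.error(`Error: ${err}`); process.exit(1) })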
  31. readable-stream: npm package that contains the latest version of the Node.js stream library. It also makes Node.js streams compatible with the browser (can be used with Webpack and Browserify).
      npm.im/readable-stream
      *Yeah, the name is misleading: the package offers all the functionality of the official 'stream' package, not just readable streams.*
  32. The browser demo pipeline: EmojiStream → Uppercasify → DOMAppend (sample output: LEMON, BANANA)
      class EmojiStream extends Readable {
        _read () {
          // ...
        }
      }
      class Uppercasify extends Transform {
        _transform (chunk, enc, done) {
          // ...
        }
      }
      class DOMAppend extends Writable {
        _write (chunk, enc, done) {
          // ...
        }
      }
      this.push(data) passes data to the next step.
  33. // emoji-stream.js (custom readable stream)
      const { EMOJI_MAP } = require('emoji') // from npm
      const { Readable } = require('readable-stream') // from npm
      const emojis = Object.keys(EMOJI_MAP)
      function getEmojiDescription (index) {
        return EMOJI_MAP[emojis[index]][1]
      }
      function getMessage (index) {
        return emojis[index] + ' ' + getEmojiDescription(index)
      }
      class EmojiStream extends Readable {
        constructor (options) {
          super(options)
          this._index = 0
        }
        _read () {
          if (this._index >= emojis.length) {
            return this.push(null)
          }
          return this.push(getMessage(this._index++))
        }
      }
      module.exports = EmojiStream
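A quick way to try the custom readable in plain Node.js before wiring up the browser pipeline (my sketch; the relative path assumes the file above):

    const EmojiStream = require('./emoji-stream')
    const emoji = new EmojiStream()
    // log each pushed message as it is produced
    emoji.on('data', msg => console.log(msg.toString()))
    emoji.on('end', () => console.log('all emojis emitted'))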
  34. // uppercasify.js (custom transform stream)
      const { Transform } = require('readable-stream')
      class Uppercasify extends Transform {
        _transform (chunk, encoding, done) {
          this.push(chunk.toString().toUpperCase())
          done()
        }
      }
      module.exports = Uppercasify
  35. // dom-append.js (custom writable stream)
      const { Writable } = require('readable-stream')
      class DOMAppend extends Writable {
        _write (chunk, encoding, done) {
          const elem = document.createElement('li')
          const content = document.createTextNode(chunk.toString())
          elem.appendChild(content)
          document.getElementById('list').appendChild(elem)
          done()
        }
      }
      module.exports = DOMAppend
  36. // browser/app.js
      const EmojiStream = require('../emoji-stream')
      const Uppercasify = require('../uppercasify')
      const DOMAppend = require('../dom-append')
      const emoji = new EmojiStream()
      const uppercasify = new Uppercasify()
      const append = new DOMAppend()
      emoji
        .pipe(uppercasify)
        .pipe(append)
  37. Let's use webpack to build this app for the browser:
      npm i --save-dev webpack webpack-cli
      node_modules/.bin/webpack src/browser/app.js
      # creates dist/main.js
      mv dist/main.js src/browser/app.bundle.js
  38. Finally, let's create an index.html:
      <!DOCTYPE html>
      <html lang="en">
      <head>
        <meta charset="utf-8" />
        <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no" />
        <title>Streams in the browser!</title>
      </head>
      <body>
        <ul id="list"></ul>
        <script src="app.bundle.js"></script>
      </body>
      </html>
  39. TL;DR
      Streams have a low memory footprint
      Process data as soon as it's available
      Composition through pipelines
      Streams are abstractions:
        Readable = input
        Transform = business logic
        Writable = output
  40. IF YOU WANT TO LEARN (EVEN) MOAR ABOUT STREAMS...
      nodejs.org/api/stream.html
      github.com/substack/stream-handbook
  41. IF YOU ARE NOT CONVINCED YET...
      curl parrot.live
  42. CREDITS
      Dan Roizer on Unsplash for the cover picture
      emojiart.org for the amazing St. Patrick emoji art
      The internet for the memes! :D
      SPECIAL THANKS
      @StefanoAbalsamo, @mariocasciaro, @machine_person, @Podgeypoos79, @katavic_d, @UrsoLuca