It’s about time to embrace Node.js Streams - Node.js Austin meetup

With very practical examples, we'll learn how streams work in Node.js and the browser. With streams, you will be able to write elegant JavaScript applications that are much more composable and memory efficient. Streams are probably one of the most beautiful features of Node.js, yet they are still largely underestimated and rarely used. Once you grasp the fundamentals, you'll be able to solve ordinary programming challenges in a much more elegant and efficient way. With the power of streams in your tool belt, you'll be able to write applications that can deal with gigabytes or even terabytes of data efficiently. This talk covers the following topics: streams, when and how; different types of streams; built-in and custom streams; composability; utils; and streams in the browser.

Luciano Mammino

August 21, 2019

Transcript

  1. Luciano Mammino (@loige) IT’S ABOUT TIME TO EMBRACE NODE.JS STREAMS loige.link/streams-austin MEETUP Austin, TX August 22nd 1
  2-5. // buffer-copy.js

    const { readFileSync, writeFileSync } = require('fs')

    const [,, src, dest] = process.argv

    // read entire file content
    const content = readFileSync(src)

    // write that content somewhere else
    writeFileSync(dest, content)

    @loige 2
  6. @loige 3

  7. WE DO THIS ALL THE TIME @loige 3

  8. WE DO THIS ALL THE TIME AND IT'S OK @loige 3

  9. WE DO THIS ALL THE TIME AND IT'S OK BUT SOMETIMES ... @loige 3
  10. @loige ERR_FS_FILE_TOO_LARGE! File size is greater than possible Buffer 4
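
Why does this fail? Node.js cannot allocate a single Buffer larger than buffer.constants.MAX_LENGTH, so readFileSync on a huge file blows past that limit. A minimal sketch to check the limit on your own machine (my own example, not from the deck):

    // max-buffer-size.js - print the maximum Buffer size for this Node.js build
    const { constants } = require('buffer')

    // buffer.constants.MAX_LENGTH is the largest Buffer Node.js can allocate
    // (roughly 2 GiB on the Node.js releases current at the time of this talk)
    console.log(`Max Buffer size: ${constants.MAX_LENGTH} bytes`)
    console.log(`That is ~${(constants.MAX_LENGTH / 1024 ** 3).toFixed(2)} GiB`)
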
  11. BUT WHY? @loige 5

  12. IF BYTES WERE BLOCKS... @loige 6

  13. MARIO CAN LIFT A FEW BLOCKS @loige 7

  14. BUT NOT TOO MANY... @loige ?! 8

  15. WHAT CAN WE DO IF WE HAVE TO MOVE MANY BLOCKS? @loige 9

  16. WE CAN MOVE THEM ONE BY ONE! @loige we stream them... 10
  17. 11

  18-21. HELLO, I AM LUCIANO! 11

  22. HELLO, I AM LUCIANO! Cloud Architect 11

  23-24. HELLO, I AM LUCIANO! Cloud Architect
    Blog: loige.co / Twitter: @loige / GitHub: @lmammino 11
  25. code: loige.link/streams-examples loige.link/streams-austin 12

  26. 01. BUFFERS VS STREAMS @loige 13

  27. BUFFER: DATA STRUCTURE TO STORE AND TRANSFER ARBITRARY BINARY DATA
    @loige * Note that this is loading all the content of the file in memory * 14
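
As a quick illustration of what a Buffer is (my own minimal sketch, not a slide from the deck), all of the bytes live in memory at once:

    // buffer-example.js - a Buffer holds raw bytes, fully loaded in memory
    const buf = Buffer.from('Hello 🌍', 'utf8')

    console.log(buf.length)            // number of bytes (not characters)
    console.log(buf.toString('hex'))   // the raw bytes as hexadecimal
    console.log(buf.toString('utf8'))  // back to a string
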
  28. STREAM: ABSTRACT INTERFACE FOR WORKING WITH STREAMING DATA
    @loige * It does not load all the data straight away * 15
  29. FILE COPY: THE BUFFER WAY
    @loige

    // buffer-copy.js
    const { readFileSync, writeFileSync } = require('fs')

    const [,, src, dest] = process.argv

    const content = readFileSync(src)
    writeFileSync(dest, content)
    16
  30-33. FILE COPY: THE STREAM WAY

    // stream-copy.js
    const {
      createReadStream,
      createWriteStream
    } = require('fs')

    const [,, src, dest] = process.argv
    const srcStream = createReadStream(src)
    const destStream = createWriteStream(dest)
    srcStream.on('data', (data) => destStream.write(data))

    @loige * Careful: this implementation is not optimal * 17
  34. MEMORY COMPARISON (~600MB FILE)
    node --inspect-brk buffer-copy.js assets/poster.psd ~/Downloads/poster.psd @loige 18

  35. MEMORY COMPARISON (~600MB FILE)
    node --inspect-brk stream-copy.js assets/poster.psd ~/Downloads/poster.psd @loige 19

  36. LET'S TRY WITH A BIG FILE (~10GB) @loige 20

  37. LET'S TRY WITH A BIG FILE (~10GB)
    node --inspect-brk stream-copy.js assets/the-matrix-hd.mkv ~/Downloads/the-matrix-hd.mkv @loige 21
  38. STREAMS VS BUFFERS
    Streams keep a low memory footprint even with large amounts of data.
    Streams allow you to process data as soon as it arrives.
    @loige 22
  39. 02. STREAM TYPES & APIS @loige 23
  40. ALL STREAMS ARE EVENT EMITTERS
    A stream instance is an object that emits events when its internal state changes, for instance:

    s.on('readable', () => {}) // ready to be consumed
    s.on('data', (chunk) => {}) // new data is available
    s.on('error', (err) => {}) // some error happened
    s.on('end', () => {}) // no more data available

    The events available depend on the type of stream @loige 24
  41. READABLE STREAMS
    A readable stream represents a source from which data is consumed.
    Examples: fs readStream, process.stdin, HTTP response (client-side), HTTP request (server-side), AWS S3 GetObject (data field).
    It supports two modes for data consumption: flowing and paused (or non-flowing) mode. @loige 25
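
The next slides walk through flowing mode; for completeness, here is a minimal sketch of paused (non-flowing) mode, where you pull chunks explicitly with read() whenever the stream emits 'readable' (my own example, assuming a file path as the first CLI argument):

    // read-paused-mode.js - consume a readable stream in paused mode
    const { createReadStream } = require('fs')

    const file = createReadStream(process.argv[2])
    let bytes = 0

    file.on('readable', () => {
      // read() returns chunks from the internal buffer, or null when it is empty
      let chunk
      while ((chunk = file.read()) !== null) {
        bytes += chunk.length
      }
    })

    file.on('end', () => console.log(`Read ${bytes} bytes`))
    file.on('error', err => console.error(`Error reading file: ${err}`))
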
  42-51. READABLE STREAMS
    Data is read from the source automatically and chunks are emitted as soon as they are available. When no more data is available, end is emitted.
    [Animated diagram: source data chunks 1, 2, 3 are read one by one by a readable stream in flowing mode and delivered to the data listener, followed by the end event.]
    @loige 26-35
  52-58. // count-emojis-flowing.js

    const { createReadStream } = require('fs')
    const { EMOJI_MAP } = require('emoji') // from npm

    const emojis = Object.keys(EMOJI_MAP)

    const file = createReadStream(process.argv[2])
    let counter = 0

    file.on('data', chunk => {
      for (let char of chunk.toString('utf8')) {
        if (emojis.includes(char)) {
          counter++
        }
      }
    })
    file.on('end', () => console.log(`Found ${counter} emojis`))
    file.on('error', err => console.error(`Error reading file: ${err}`))

    @loige 36
  59. loige.link/up-emojiart @loige 37

  60. READABLE STREAMS ARE ALSO ASYNC ITERATORS (NODE.JS 10+) @loige 38
  61-63. // count-emojis-async-iterator.js

    const { createReadStream } = require('fs')
    const { EMOJI_MAP } = require('emoji') // from npm

    async function main () {
      const emojis = Object.keys(EMOJI_MAP)
      const file = createReadStream(process.argv[2])
      let counter = 0

      for await (let chunk of file) {
        for (let char of chunk.toString('utf8')) {
          if (emojis.includes(char)) {
            counter++
          }
        }
      }

      console.log(`Found ${counter} emojis`)
    }

    main()

    @loige 39
  64. WRITABLE STREAMS
    A writable stream is an abstraction that allows you to write data to a destination.
    Examples: fs writeStream, process.stdout, process.stderr, HTTP request (client-side), HTTP response (server-side), AWS S3 PutObject (body parameter) @loige 40
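
Before the HTTP request example that follows, here is a minimal sketch of the simplest writable case, writing to a file with fs.createWriteStream (my own example; the output file name is made up):

    // writable-file.js - write a few chunks to a file, then close the stream
    const { createWriteStream } = require('fs')

    const out = createWriteStream('notes.txt') // hypothetical output file

    out.write('first line\n')
    out.write('second line\n')
    out.end('last line\n') // final write, then close the stream

    out.on('finish', () => console.log('all data flushed to notes.txt'))
    out.on('error', err => console.error(`Write failed: ${err}`))
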
  65-68. // writable-http-request.js

    const http = require('http')

    const req = http.request(
      {
        hostname: 'enx6b07hdu6cs.x.pipedream.net',
        method: 'POST'
      },
      resp => {
        console.log(`Server responded with "${resp.statusCode}"`)
      }
    )

    req.on('finish', () => console.log('request sent'))
    req.on('close', () => console.log('Connection closed'))
    req.on('error', err => console.error(`Request failed: ${err}`))

    req.write('writing some content...\n')
    req.end('last write & close the stream')

    @loige 41
  69. @loige 42

  70. loige.link/writable-http-req @loige 43

  71. BACKPRESSURE
    When writing large amounts of data you should make sure you handle the stop write signal and the drain event.
    loige.link/backpressure @loige 44
  72-77. // stream-copy-safe.js

    const { createReadStream, createWriteStream } = require('fs')

    const [, , src, dest] = process.argv
    const srcStream = createReadStream(src)
    const destStream = createWriteStream(dest)

    srcStream.on('data', data => {
      const canContinue = destStream.write(data)
      if (!canContinue) {
        // we are overflowing the destination, we should pause
        srcStream.pause()
        // we will resume when the destination stream is drained
        destStream.once('drain', () => srcStream.resume())
      }
    })

    @loige 45
  78. OTHER TYPES OF STREAM
    Duplex Stream: streams that are both Readable and Writable (net.Socket).
    Transform Stream: Duplex streams that can modify or transform the data as it is written and read (zlib.createGzip(), crypto.createCipheriv()).
    @loige 46
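
To make the Transform idea concrete, here is a minimal sketch of a custom Transform stream that uppercases whatever flows through it (my own example built on the stream.Transform class, not a snippet from the deck):

    // uppercase-transform.js - a tiny custom Transform stream
    const { Transform } = require('stream')

    const uppercase = new Transform({
      transform (chunk, encoding, callback) {
        // push the transformed chunk downstream and signal that this chunk is done
        callback(null, chunk.toString('utf8').toUpperCase())
      }
    })

    // usage: echo "hello streams" | node uppercase-transform.js
    process.stdin.pipe(uppercase).pipe(process.stdout)
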
  79-82. ANATOMY OF A TRANSFORM STREAM
    [Diagram: 1. write data into the transform stream (its writable side), 2. the stream transforms the data, 3. read the transformed data out (its readable side).]
    @loige 47
  83-86. GZIP EXAMPLE
    [Diagram: uncompressed data is written into the transform stream, zlib.createGzip() compresses it, and compressed data is read out on the other side.]
    @loige 48
  87-100. HOW CAN WE USE TRANSFORM STREAMS?
    [Animated diagram: Readable -> Transform -> Writable. On every data event the chunk is passed on with write(); when write() returns false (backpressure), the upstream stream is paused with pause() and resumed with resume() on the next drain event. This happens on both the Readable->Transform and the Transform->Writable link.]
    You also have to handle end & error events!
    @loige 49
  101-102. // stream-copy-gzip.js

    const { createReadStream, createWriteStream } = require('fs')
    const { createGzip } = require('zlib')

    const [, , src, dest] = process.argv
    const srcStream = createReadStream(src)
    const gzipStream = createGzip()
    const destStream = createWriteStream(dest)

    srcStream.on('data', data => {
      const canContinue = gzipStream.write(data)
      if (!canContinue) {
        srcStream.pause()
        gzipStream.once('drain', () => {
          srcStream.resume()
        })
      }
    })

    srcStream.on('end', () => {
      // check if there's buffered data left
      const remainingData = gzipStream.read()
      if (remainingData !== null) {
        destStream.write(remainingData)
      }
      gzipStream.end()
    })

    gzipStream.on('data', data => {
      const canContinue = destStream.write(data)
      if (!canContinue) {
        gzipStream.pause()
        destStream.once('drain', () => {
          gzipStream.resume()
        })
      }
    })

    gzipStream.on('end', () => {
      destStream.end()
    })

    // ⚠ TODO: handle errors!

    @loige 50
  103. 03. PIPE() @loige 51

  104. readable
      .pipe(transform1)
      .pipe(transform2)
      .pipe(transform3)
      .pipe(writable)

    readable.pipe(writableDest)

    @loige
    Connects a readable stream to a writable stream.
    A transform stream can be used as a destination as well.
    It returns the destination stream, allowing for a chain of pipes.
    52
  105-106. // stream-copy-gzip-pipe.js

    const {
      createReadStream,
      createWriteStream
    } = require('fs')
    const { createGzip } = require('zlib')

    const [, , src, dest] = process.argv
    const srcStream = createReadStream(src)
    const gzipStream = createGzip()
    const destStream = createWriteStream(dest)

    srcStream
      .pipe(gzipStream)
      .pipe(destStream)

    @loige 53
  107. Setup complex pipelines with pipe

    readable
      .pipe(decompress)
      .pipe(decrypt)
      .pipe(convert)
      .pipe(encrypt)
      .pipe(compress)
      .pipe(writeToDisk)

    @loige This is the most common way to use streams 54
  108-111. Handling errors (correctly)

    readable
      .on('error', handleErr)
      .pipe(decompress)
      .on('error', handleErr)
      .pipe(decrypt)
      .on('error', handleErr)
      .pipe(convert)
      .on('error', handleErr)
      .pipe(encrypt)
      .on('error', handleErr)
      .pipe(compress)
      .on('error', handleErr)
      .pipe(writeToDisk)
      .on('error', handleErr)

    handleErr should end and destroy the streams (it doesn't happen automatically)

    @loige 55
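    A minimal sketch of what such a handleErr could look like, applied to the stream-copy-gzip example from earlier (not part of the original slides; the cleanup logic is the point, the file and variable names are just illustrative). pipe() does not destroy the other streams when one of them fails, so the handler does it explicitly:

    // handle-errors-sketch.js - a sketch, not from the slides
    const { createReadStream, createWriteStream } = require('fs')
    const { createGzip } = require('zlib')

    const [, , src, dest] = process.argv
    const srcStream = createReadStream(src)
    const gzipStream = createGzip()
    const destStream = createWriteStream(dest)

    function handleErr (err) {
      console.error(`Error: ${err}`)
      // destroy every stream in the chain to release file descriptors and buffers
      srcStream.destroy()
      gzipStream.destroy()
      destStream.destroy()
      process.exitCode = 1
    }

    srcStream
      .on('error', handleErr)
      .pipe(gzipStream)
      .on('error', handleErr)
      .pipe(destStream)
      .on('error', handleErr)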
  112. 04. STREAM UTILITIES @loige 56

  113-117. stream.pipeline(...streams, callback) - Node.js 10+

    // stream-copy-gzip-pipeline.js
    const { pipeline } = require('stream')
    const { createReadStream, createWriteStream } = require('fs')
    const { createGzip } = require('zlib')

    const [, , src, dest] = process.argv

    pipeline(
      createReadStream(src),
      createGzip(),
      createWriteStream(dest),
      function onEnd (err) {
        if (err) {
          console.error(`Error: ${err}`)
          process.exit(1)
        }

        console.log('Done!')
      }
    )

    You can pass multiple streams (they will be piped).
    The last argument is a callback. If invoked with an error, it means the pipeline failed at some point.
    All the streams are ended and destroyed correctly.

    @loige 57
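    Not shown in the slides, but handy to know: pipeline also composes well with async/await. A minimal sketch using util.promisify (newer Node.js versions also expose the same function as require('stream/promises').pipeline):

    // stream-copy-gzip-promise.js - a sketch, not from the slides
    const { pipeline } = require('stream')
    const { promisify } = require('util')
    const { createReadStream, createWriteStream } = require('fs')
    const { createGzip } = require('zlib')

    const pipelinePromise = promisify(pipeline)

    async function copyGzip (src, dest) {
      // resolves when every stream has finished, rejects on the first error
      await pipelinePromise(
        createReadStream(src),
        createGzip(),
        createWriteStream(dest)
      )
    }

    const [, , src, dest] = process.argv
    copyGzip(src, dest)
      .then(() => console.log('Done!'))
      .catch((err) => {
        console.error(`Error: ${err}`)
        process.exit(1)
      })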
  118-119. For Node.js < 10: pump - npm.im/pump

    // stream-copy-gzip-pump.js
    const pump = require('pump') // from npm
    const { createReadStream, createWriteStream } = require('fs')
    const { createGzip } = require('zlib')

    const [, , src, dest] = process.argv

    pump( // just swap pipeline with pump!
      createReadStream(src),
      createGzip(),
      createWriteStream(dest),
      function onEnd (err) {
        if (err) {
          console.error(`Error: ${err}`)
          process.exit(1)
        }

        console.log('Done!')
      }
    )

    @loige 58
  120. pumpify(...streams) - Create reusable pieces of pipeline - npm.im/pumpify

    Let's create EncGz, an application that helps us read and write encrypted-gzipped files.

    @loige 59
  121-125. // encgz-stream.js - utility library
    const {
      createCipheriv,
      createDecipheriv,
      randomBytes,
      createHash
    } = require('crypto')
    const { createGzip, createGunzip } = require('zlib')
    const pumpify = require('pumpify') // from npm

    // calculates md5 of the secret (trimmed)
    function getChiperKey (secret) {}

    function createEncgz (secret) {
      const initVect = randomBytes(16)
      const cipherKey = getChiperKey(secret)
      const encryptStream = createCipheriv('aes256', cipherKey, initVect)
      const gzipStream = createGzip()

      const stream = pumpify(encryptStream, gzipStream)
      stream.initVect = initVect

      return stream
    }

    @loige 60
  126-130. // encgz-stream.js (...continue from previous slide)
    function createDecgz (secret, initVect) {
      const cipherKey = getChiperKey(secret)
      const decryptStream = createDecipheriv('aes256', cipherKey, initVect)
      const gunzipStream = createGunzip()

      const stream = pumpify(gunzipStream, decryptStream)
      return stream
    }

    module.exports = {
      createEncgz,
      createDecgz
    }

    @loige 61
  131-135. // encgz.js - CLI to encrypt and gzip (from stdin to stdout)
    const { pipeline } = require('stream')
    const { createEncgz } = require('./encgz-stream')

    const [, , secret] = process.argv

    const encgz = createEncgz(secret)
    console.error(`init vector: ${encgz.initVect.toString('hex')}`)

    pipeline(
      process.stdin,
      encgz,
      process.stdout,
      function onEnd (err) {
        if (err) {
          console.error(`Error: ${err}`)
          process.exit(1)
        }
      }
    )

    @loige 62
  136-140. // decgz.js - CLI to gunzip and decrypt (from stdin to stdout)
    const { pipeline } = require('stream')
    const { createDecgz } = require('./encgz-stream')

    const [, , secret, initVect] = process.argv

    const decgz = createDecgz(secret, Buffer.from(initVect, 'hex'))

    pipeline(
      process.stdin,
      decgz,
      process.stdout,
      function onEnd (err) {
        if (err) {
          console.error(`Error: ${err}`)
          process.exit(1)
        }
      }
    )

    @loige 63
  141. @loige 64

  142. readable-stream - npm package that contains the latest version of the
    Node.js stream library. It also makes Node.js streams compatible with the browser
    (it can be used with Webpack and Browserify). npm.im/readable-stream

    * Yeah, the name is misleading: the package offers all the functionality of the official 'stream' package, not just readable streams. *

    @loige 65
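    As a minimal sketch of the browser use case (assuming the file gets bundled with Webpack or Browserify; none of this is from the slides), the classes exported by readable-stream behave exactly like the ones in the core 'stream' module:

    // browser-stream-sketch.js - a sketch, meant to be bundled for the browser
    const { PassThrough } = require('readable-stream') // from npm

    const channel = new PassThrough()

    // in a bundled page this handler could append to the DOM instead of the console
    channel.on('data', (chunk) => console.log(chunk.toString()))

    channel.write('streams ')
    channel.write('in the browser!')
    channel.end()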
  143. 05. WRITING CUSTOM STREAMS @loige 66

  144-150. EmojiStream → Uppercasify → DOMAppend

    (slide build: EmojiStream emits "Lemon", Uppercasify turns it into "LEMON" and DOMAppend appends it to the page; the same then happens with "Banana" / "BANANA")

    @loige 67

  151-154. EmojiStream → Uppercasify → DOMAppend

    class EmojiStream extends Readable {
      _read() {
        // ...
      }
    }

    class Uppercasify extends Transform {
      _transform(chunk, enc, done) {
        // ...
      }
    }

    class DOMAppend extends Writable {
      _write(chunk, enc, done) {
        // ...
      }
    }

    Use this.push(data) to pass data to the next step.

    @loige 67
  155. // emoji-stream.js (custom readable stream) const { EMOJI_MAP } =

    require('emoji') // from npm const { Readable } = require('readable-stream') // from npm const emojis = Object.keys(EMOJI_MAP) function getEmojiDescription (index) { return EMOJI_MAP[emojis[index]][1] } function getMessage (index) { return emojis[index] + ' ' + getEmojiDescription(index) } class EmojiStream extends Readable { constructor (options) { super(options) this._index = 0 } _read () { if (this._index >= emojis.length) { return this.push(null) } return this.push(getMessage(this._index++)) } } module.exports = EmojiStream 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 @loige 68
  156. // emoji-stream.js (custom readable stream) const { EMOJI_MAP } =

    require('emoji') // from npm const { Readable } = require('readable-stream') // from npm const emojis = Object.keys(EMOJI_MAP) function getEmojiDescription (index) { return EMOJI_MAP[emojis[index]][1] } function getMessage (index) { return emojis[index] + ' ' + getEmojiDescription(index) } class EmojiStream extends Readable { constructor (options) { super(options) this._index = 0 } _read () { if (this._index >= emojis.length) { return this.push(null) } return this.push(getMessage(this._index++)) } } module.exports = EmojiStream 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 const { Readable } = require('readable-stream') // from npm // emoji-stream.js (custom readable stream) 1 const { EMOJI_MAP } = require('emoji') // from npm 2 3 const emojis = Object.keys(EMOJI_MAP) 4 function getEmojiDescription (index) { 5 return EMOJI_MAP[emojis[index]][1] 6 } 7 function getMessage (index) { 8 return emojis[index] + ' ' + getEmojiDescription(index) 9 } 10 11 class EmojiStream extends Readable { 12 constructor (options) { 13 super(options) 14 this._index = 0 15 } 16 17 _read () { 18 if (this._index >= emojis.length) { 19 return this.push(null) 20 } 21 return this.push(getMessage(this._index++)) 22 } 23 } 24 25 module.exports = EmojiStream 26 @loige 68
  157. // emoji-stream.js (custom readable stream) const { EMOJI_MAP } =

    require('emoji') // from npm const { Readable } = require('readable-stream') // from npm const emojis = Object.keys(EMOJI_MAP) function getEmojiDescription (index) { return EMOJI_MAP[emojis[index]][1] } function getMessage (index) { return emojis[index] + ' ' + getEmojiDescription(index) } class EmojiStream extends Readable { constructor (options) { super(options) this._index = 0 } _read () { if (this._index >= emojis.length) { return this.push(null) } return this.push(getMessage(this._index++)) } } module.exports = EmojiStream 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 const { Readable } = require('readable-stream') // from npm // emoji-stream.js (custom readable stream) 1 const { EMOJI_MAP } = require('emoji') // from npm 2 3 const emojis = Object.keys(EMOJI_MAP) 4 function getEmojiDescription (index) { 5 return EMOJI_MAP[emojis[index]][1] 6 } 7 function getMessage (index) { 8 return emojis[index] + ' ' + getEmojiDescription(index) 9 } 10 11 class EmojiStream extends Readable { 12 constructor (options) { 13 super(options) 14 this._index = 0 15 } 16 17 _read () { 18 if (this._index >= emojis.length) { 19 return this.push(null) 20 } 21 return this.push(getMessage(this._index++)) 22 } 23 } 24 25 module.exports = EmojiStream 26 class EmojiStream extends Readable { } // emoji-stream.js (custom readable stream) 1 const { EMOJI_MAP } = require('emoji') // from npm 2 const { Readable } = require('readable-stream') // from npm 3 const emojis = Object.keys(EMOJI_MAP) 4 function getEmojiDescription (index) { 5 return EMOJI_MAP[emojis[index]][1] 6 } 7 function getMessage (index) { 8 return emojis[index] + ' ' + getEmojiDescription(index) 9 } 10 11 12 constructor (options) { 13 super(options) 14 this._index = 0 15 } 16 17 _read () { 18 if (this._index >= emojis.length) { 19 return this.push(null) 20 } 21 return this.push(getMessage(this._index++)) 22 } 23 24 25 module.exports = EmojiStream 26 @loige 68
  158. // emoji-stream.js (custom readable stream) const { EMOJI_MAP } =

    require('emoji') // from npm const { Readable } = require('readable-stream') // from npm const emojis = Object.keys(EMOJI_MAP) function getEmojiDescription (index) { return EMOJI_MAP[emojis[index]][1] } function getMessage (index) { return emojis[index] + ' ' + getEmojiDescription(index) } class EmojiStream extends Readable { constructor (options) { super(options) this._index = 0 } _read () { if (this._index >= emojis.length) { return this.push(null) } return this.push(getMessage(this._index++)) } } module.exports = EmojiStream 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 const { Readable } = require('readable-stream') // from npm // emoji-stream.js (custom readable stream) 1 const { EMOJI_MAP } = require('emoji') // from npm 2 3 const emojis = Object.keys(EMOJI_MAP) 4 function getEmojiDescription (index) { 5 return EMOJI_MAP[emojis[index]][1] 6 } 7 function getMessage (index) { 8 return emojis[index] + ' ' + getEmojiDescription(index) 9 } 10 11 class EmojiStream extends Readable { 12 constructor (options) { 13 super(options) 14 this._index = 0 15 } 16 17 _read () { 18 if (this._index >= emojis.length) { 19 return this.push(null) 20 } 21 return this.push(getMessage(this._index++)) 22 } 23 } 24 25 module.exports = EmojiStream 26 class EmojiStream extends Readable { } // emoji-stream.js (custom readable stream) 1 const { EMOJI_MAP } = require('emoji') // from npm 2 const { Readable } = require('readable-stream') // from npm 3 const emojis = Object.keys(EMOJI_MAP) 4 function getEmojiDescription (index) { 5 return EMOJI_MAP[emojis[index]][1] 6 } 7 function getMessage (index) { 8 return emojis[index] + ' ' + getEmojiDescription(index) 9 } 10 11 12 constructor (options) { 13 super(options) 14 this._index = 0 15 } 16 17 _read () { 18 if (this._index >= emojis.length) { 19 return this.push(null) 20 } 21 return this.push(getMessage(this._index++)) 22 } 23 24 25 module.exports = EmojiStream 26 _read () { } // emoji-stream.js (custom readable stream) 1 const { EMOJI_MAP } = require('emoji') // from npm 2 const { Readable } = require('readable-stream') // from npm 3 const emojis = Object.keys(EMOJI_MAP) 4 function getEmojiDescription (index) { 5 return EMOJI_MAP[emojis[index]][1] 6 } 7 function getMessage (index) { 8 return emojis[index] + ' ' + getEmojiDescription(index) 9 } 10 11 class EmojiStream extends Readable { 12 constructor (options) { 13 super(options) 14 this._index = 0 15 } 16 17 18 if (this._index >= emojis.length) { 19 return this.push(null) 20 } 21 return this.push(getMessage(this._index++)) 22 23 } 24 25 module.exports = EmojiStream 26 @loige 68
  159. // emoji-stream.js (custom readable stream) const { EMOJI_MAP } =

    require('emoji') // from npm const { Readable } = require('readable-stream') // from npm const emojis = Object.keys(EMOJI_MAP) function getEmojiDescription (index) { return EMOJI_MAP[emojis[index]][1] } function getMessage (index) { return emojis[index] + ' ' + getEmojiDescription(index) } class EmojiStream extends Readable { constructor (options) { super(options) this._index = 0 } _read () { if (this._index >= emojis.length) { return this.push(null) } return this.push(getMessage(this._index++)) } } module.exports = EmojiStream 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 const { Readable } = require('readable-stream') // from npm // emoji-stream.js (custom readable stream) 1 const { EMOJI_MAP } = require('emoji') // from npm 2 3 const emojis = Object.keys(EMOJI_MAP) 4 function getEmojiDescription (index) { 5 return EMOJI_MAP[emojis[index]][1] 6 } 7 function getMessage (index) { 8 return emojis[index] + ' ' + getEmojiDescription(index) 9 } 10 11 class EmojiStream extends Readable { 12 constructor (options) { 13 super(options) 14 this._index = 0 15 } 16 17 _read () { 18 if (this._index >= emojis.length) { 19 return this.push(null) 20 } 21 return this.push(getMessage(this._index++)) 22 } 23 } 24 25 module.exports = EmojiStream 26 class EmojiStream extends Readable { } // emoji-stream.js (custom readable stream) 1 const { EMOJI_MAP } = require('emoji') // from npm 2 const { Readable } = require('readable-stream') // from npm 3 const emojis = Object.keys(EMOJI_MAP) 4 function getEmojiDescription (index) { 5 return EMOJI_MAP[emojis[index]][1] 6 } 7 function getMessage (index) { 8 return emojis[index] + ' ' + getEmojiDescription(index) 9 } 10 11 12 constructor (options) { 13 super(options) 14 this._index = 0 15 } 16 17 _read () { 18 if (this._index >= emojis.length) { 19 return this.push(null) 20 } 21 return this.push(getMessage(this._index++)) 22 } 23 24 25 module.exports = EmojiStream 26 _read () { } // emoji-stream.js (custom readable stream) 1 const { EMOJI_MAP } = require('emoji') // from npm 2 const { Readable } = require('readable-stream') // from npm 3 const emojis = Object.keys(EMOJI_MAP) 4 function getEmojiDescription (index) { 5 return EMOJI_MAP[emojis[index]][1] 6 } 7 function getMessage (index) { 8 return emojis[index] + ' ' + getEmojiDescription(index) 9 } 10 11 class EmojiStream extends Readable { 12 constructor (options) { 13 super(options) 14 this._index = 0 15 } 16 17 18 if (this._index >= emojis.length) { 19 return this.push(null) 20 } 21 return this.push(getMessage(this._index++)) 22 23 } 24 25 module.exports = EmojiStream 26 if (this._index >= emojis.length) { } // emoji-stream.js (custom readable stream) 1 const { EMOJI_MAP } = require('emoji') // from npm 2 const { Readable } = require('readable-stream') // from npm 3 const emojis = Object.keys(EMOJI_MAP) 4 function getEmojiDescription (index) { 5 return EMOJI_MAP[emojis[index]][1] 6 } 7 function getMessage (index) { 8 return emojis[index] + ' ' + getEmojiDescription(index) 9 } 10 11 class EmojiStream extends Readable { 12 constructor (options) { 13 super(options) 14 this._index = 0 15 } 16 17 _read () { 18 19 return this.push(null) 20 21 return this.push(getMessage(this._index++)) 22 } 23 } 24 25 module.exports = EmojiStream 26 @loige 68
  160. // emoji-stream.js (custom readable stream) const { EMOJI_MAP } =

    require('emoji') // from npm const { Readable } = require('readable-stream') // from npm const emojis = Object.keys(EMOJI_MAP) function getEmojiDescription (index) { return EMOJI_MAP[emojis[index]][1] } function getMessage (index) { return emojis[index] + ' ' + getEmojiDescription(index) } class EmojiStream extends Readable { constructor (options) { super(options) this._index = 0 } _read () { if (this._index >= emojis.length) { return this.push(null) } return this.push(getMessage(this._index++)) } } module.exports = EmojiStream 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 const { Readable } = require('readable-stream') // from npm // emoji-stream.js (custom readable stream) 1 const { EMOJI_MAP } = require('emoji') // from npm 2 3 const emojis = Object.keys(EMOJI_MAP) 4 function getEmojiDescription (index) { 5 return EMOJI_MAP[emojis[index]][1] 6 } 7 function getMessage (index) { 8 return emojis[index] + ' ' + getEmojiDescription(index) 9 } 10 11 class EmojiStream extends Readable { 12 constructor (options) { 13 super(options) 14 this._index = 0 15 } 16 17 _read () { 18 if (this._index >= emojis.length) { 19 return this.push(null) 20 } 21 return this.push(getMessage(this._index++)) 22 } 23 } 24 25 module.exports = EmojiStream 26 class EmojiStream extends Readable { } // emoji-stream.js (custom readable stream) 1 const { EMOJI_MAP } = require('emoji') // from npm 2 const { Readable } = require('readable-stream') // from npm 3 const emojis = Object.keys(EMOJI_MAP) 4 function getEmojiDescription (index) { 5 return EMOJI_MAP[emojis[index]][1] 6 } 7 function getMessage (index) { 8 return emojis[index] + ' ' + getEmojiDescription(index) 9 } 10 11 12 constructor (options) { 13 super(options) 14 this._index = 0 15 } 16 17 _read () { 18 if (this._index >= emojis.length) { 19 return this.push(null) 20 } 21 return this.push(getMessage(this._index++)) 22 } 23 24 25 module.exports = EmojiStream 26 _read () { } // emoji-stream.js (custom readable stream) 1 const { EMOJI_MAP } = require('emoji') // from npm 2 const { Readable } = require('readable-stream') // from npm 3 const emojis = Object.keys(EMOJI_MAP) 4 function getEmojiDescription (index) { 5 return EMOJI_MAP[emojis[index]][1] 6 } 7 function getMessage (index) { 8 return emojis[index] + ' ' + getEmojiDescription(index) 9 } 10 11 class EmojiStream extends Readable { 12 constructor (options) { 13 super(options) 14 this._index = 0 15 } 16 17 18 if (this._index >= emojis.length) { 19 return this.push(null) 20 } 21 return this.push(getMessage(this._index++)) 22 23 } 24 25 module.exports = EmojiStream 26 if (this._index >= emojis.length) { } // emoji-stream.js (custom readable stream) 1 const { EMOJI_MAP } = require('emoji') // from npm 2 const { Readable } = require('readable-stream') // from npm 3 const emojis = Object.keys(EMOJI_MAP) 4 function getEmojiDescription (index) { 5 return EMOJI_MAP[emojis[index]][1] 6 } 7 function getMessage (index) { 8 return emojis[index] + ' ' + getEmojiDescription(index) 9 } 10 11 class EmojiStream extends Readable { 12 constructor (options) { 13 super(options) 14 this._index = 0 15 } 16 17 _read () { 18 19 return this.push(null) 20 21 return this.push(getMessage(this._index++)) 22 } 23 } 24 25 module.exports = EmojiStream 26 return this.push(null) // emoji-stream.js (custom readable stream) 1 const { EMOJI_MAP } = require('emoji') // from npm 2 const { Readable } = require('readable-stream') // from npm 3 const emojis = Object.keys(EMOJI_MAP) 4 function getEmojiDescription (index) { 5 return 
EMOJI_MAP[emojis[index]][1] 6 } 7 function getMessage (index) { 8 return emojis[index] + ' ' + getEmojiDescription(index) 9 } 10 11 class EmojiStream extends Readable { 12 constructor (options) { 13 super(options) 14 this._index = 0 15 } 16 17 _read () { 18 if (this._index >= emojis.length) { 19 20 } 21 return this.push(getMessage(this._index++)) 22 } 23 } 24 25 module.exports = EmojiStream 26 @loige 68
  161. // emoji-stream.js (custom readable stream) const { EMOJI_MAP } =

require('emoji') // from npm
const { Readable } = require('readable-stream') // from npm

const emojis = Object.keys(EMOJI_MAP)

function getEmojiDescription (index) {
  return EMOJI_MAP[emojis[index]][1]
}

function getMessage (index) {
  return emojis[index] + ' ' + getEmojiDescription(index)
}

class EmojiStream extends Readable {
  constructor (options) {
    super(options)
    this._index = 0
  }

  _read () {
    if (this._index >= emojis.length) {
      return this.push(null)
    }
    return this.push(getMessage(this._index++))
  }
}

module.exports = EmojiStream

@loige 68
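If we want to try EmojiStream on its own in Node first, a minimal runner could look like the sketch below (this is not part of the slides, and the file names are just assumptions):

// consume-emojis.js (hypothetical quick check, not from the talk)
const EmojiStream = require('./emoji-stream')

const emojiStream = new EmojiStream()

// 'data' events fire as soon as _read() pushes a chunk
emojiStream.on('data', (chunk) => {
  console.log(chunk.toString())
})

// 'end' fires after push(null) signals that there is no more data
emojiStream.on('end', () => {
  console.log('no more emojis')
})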
  162. // uppercasify.js (custom transform stream)

const { Transform } = require('readable-stream')

class Uppercasify extends Transform {
  _transform (chunk, encoding, done) {
    this.push(chunk.toString().toUpperCase())
    done()
  }
}

module.exports = Uppercasify

@loige 69
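A transform like this can be exercised on its own by wiring it between process.stdin and process.stdout. The little runner below is only a sketch (not in the slides), and the file names are assumptions:

// try-uppercasify.js (hypothetical: echo stdin back in upper case)
const Uppercasify = require('./uppercasify')

process.stdin
  .pipe(new Uppercasify())
  .pipe(process.stdout)

// try it with:  echo "hello streams" | node try-uppercasify.js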
  168. // dom-append.js (custom writable stream)

const { Writable } = require('readable-stream')

class DOMAppend extends Writable {
  _write (chunk, encoding, done) {
    const elem = document.createElement('li')
    const content = document.createTextNode(chunk.toString())
    elem.appendChild(content)
    document.getElementById('list').appendChild(elem)
    done()
  }
}

module.exports = DOMAppend

@loige 70
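DOMAppend only runs where a document object exists, so it is meant for the browser. To test the same Writable shape in plain Node, one option (purely hypothetical, not part of the talk) is to swap the DOM calls for a console log:

// log-append.js (hypothetical Node-only writable with the same shape as DOMAppend)
const { Writable } = require('readable-stream')

class LogAppend extends Writable {
  _write (chunk, encoding, done) {
    // instead of appending an <li> to the page, just print the chunk
    console.log('received:', chunk.toString())
    done()
  }
}

module.exports = LogAppend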
  174. 05. STREAMS IN THE BROWSER

@loige 71
  175. // browser/app.js

const EmojiStream = require('../emoji-stream')
const Uppercasify = require('../uppercasify')
const DOMAppend = require('../dom-append')

const emoji = new EmojiStream()
const uppercasify = new Uppercasify()
const append = new DOMAppend()

emoji
  .pipe(uppercasify)
  .pipe(append)

@loige 72
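The same three streams also compose in plain Node. As a sketch (assuming the modules above sit next to this file, and using Node's built-in stream.pipeline so errors are forwarded as well), it could look like this:

// pipeline-demo.js (hypothetical Node runner, not part of the slides)
const { pipeline } = require('stream') // available since Node.js 10
const EmojiStream = require('./emoji-stream')
const Uppercasify = require('./uppercasify')
const LogAppend = require('./log-append') // the hypothetical Node-only writable sketched earlier

pipeline(
  new EmojiStream(),
  new Uppercasify(),
  new LogAppend(),
  (err) => {
    if (err) {
      console.error('pipeline failed', err)
    } else {
      console.log('pipeline completed')
    }
  }
)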
  179. Let's use webpack to build this app for the browser

npm i --save-dev webpack webpack-cli
node_modules/.bin/webpack src/browser/app.js # creates dist/main.js
mv dist/main.js src/browser/app.bundle.js

@loige 73
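If you prefer a config file over CLI flags, a minimal webpack.config.js along these lines should produce the same bundle (the paths are assumptions based on the commands above, not taken from the talk):

// webpack.config.js (a minimal sketch)
const path = require('path')

module.exports = {
  mode: 'production',
  entry: './src/browser/app.js',
  output: {
    // write the bundle next to index.html so the <script> tag below can find it
    path: path.resolve(__dirname, 'src/browser'),
    filename: 'app.bundle.js'
  }
}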
  180. Finally let's create an index.html

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8" />
    <meta
      name="viewport"
      content="width=device-width,initial-scale=1,shrink-to-fit=no"
    />
    <title>Streams in the browser!</title>
  </head>
  <body>
    <ul id="list"></ul>
    <script src="app.bundle.js"></script>
  </body>
</html>

@loige 74
  183. @loige 75

  185. 06. CLOSING @loige 76

  186. TLDR;

Streams have low memory footprint
Process data as soon as it's available
Composition through pipelines
Streams are abstractions:
  Readable = Input
  Transform = Business Logic
  Writable = Output

@loige 77
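To see those three roles side by side in code, here is an illustrative pipeline (not one of the slides); gzip simply stands in for the business logic:

// copy-gzip.js (Readable = input, Transform = business logic, Writable = output)
const { pipeline } = require('stream')
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const [,, src, dest] = process.argv

pipeline(
  createReadStream(src),   // Readable: where the data comes from
  createGzip(),            // Transform: what we do with the data
  createWriteStream(dest), // Writable: where the data ends up
  (err) => {
    if (err) {
      console.error(err)
      process.exit(1)
    }
  }
)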
  187. IF YOU WANT TO LEARN (EVEN) MOAR ABOUT STREAMS...

nodejs.org/api/stream.html
github.com/substack/stream-handbook
github.com/lmammino/streams-workshop

@loige 78
  188. IF YOU ARE NOT CONVINCED YET...

curl parrot.live

@loige 79
  189. Check out the codebase: github.com/hugomd/parrot.live @loige 80

  190. @loige 81

  191. THANK YOU!

loige.link/streams-austin
We are hiring, talk to me or check out vectra.ai/about/careers

@loige 82
  192. CREDITS

Cover picture by David Mark from Pixabay
emojiart.org for the amazing St. Patrick emoji art
The internet for the memes! :D

SPECIAL THANKS
@StefanoAbalsamo, @mariocasciaro, @machine_person, @Podgeypoos79, @katavic_d, @UrsoLuca

@loige 83