
It’s about time to embrace Node.js Streams

With very practical examples we'll learn how streams work in Node.js & the browser. With streams, you will be able to write elegant JavaScript applications that are much more composable and memory efficient! Streams are probably one of the most beautiful features of Node.js, yet still largely underestimated and rarely used. Once you grasp the fundamentals, you'll be able to solve ordinary programming challenges in a much more elegant and efficient way. With the power of streams in your tool belt, you'll be able to write applications that can deal with gigabytes or even terabytes of data efficiently. This talk will cover the following topics: Streams: when and how; Different types of streams; Built-in and custom streams; Composability; Utils & Streams in the browser.

Luciano Mammino

March 23, 2019


Transcript

  1. Luciano Mammino (@loige) - IT’S ABOUT TIME TO EMBRACE NODE.JS STREAMS - loige.link/streams-rome - Rome, March 23, 2019
  2. HELLO, I AM LUCIANO! Cloud Architect. Let's connect! Blog: loige.co - Twitter: @loige - GitHub: @lmammino. Also: loige.link/node-patterns (with @mariocasciaro) and fstack.link (with @andreaman87)
  3. AGENDA: 01. Buffers VS Streams - 02. Stream types & APIs - 03. pipe() - 04. Streams utilities - 05. Writing custom streams - 06. Streams in the Browser
  4. BUFFER: DATA STRUCTURE TO STORE AND TRANSFER ARBITRARY BINARY DATA *Note that this is loading all the content of the file in memory*
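    For instance, a minimal sketch (my example, not from the deck; the file name is borrowed from the later memory-comparison demos) of how fs.readFileSync materialises an entire file as a single Buffer:

      // buffer-example.js (illustrative sketch)
      const { readFileSync } = require('fs')

      // readFileSync returns a Buffer holding the ENTIRE file content in memory
      const content = readFileSync('assets/poster.psd')
      console.log(Buffer.isBuffer(content)) // true
      console.log(content.length)           // file size in bytes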
  5. STREAM: ABSTRACT INTERFACE FOR WORKING WITH STREAMING DATA *It does not load all the data straight away*
  6. LET'S SOLVE THIS PROBLEM: 1. Read the content of a file 2. Copy it to another file* (* cp in Node.js)
  7. THE BUFFER WAY
      // buffer-copy.js
      const { readFileSync, writeFileSync } = require('fs')

      const [,, src, dest] = process.argv
      const content = readFileSync(src)
      writeFileSync(dest, content)
  8. THE STREAM WAY
      // stream-copy.js
      const { createReadStream, createWriteStream } = require('fs')

      const [,, src, dest] = process.argv
      const srcStream = createReadStream(src)
      const destStream = createWriteStream(dest)
      srcStream.on('data', (data) => destStream.write(data))
    * Careful: this implementation is not optimal *
  9. MEMORY COMPARISON (~600MB FILE): node --inspect-brk buffer-copy.js assets/poster.psd ~/Downloads/poster.psd
  10. MEMORY COMPARISON (~600MB FILE): node --inspect-brk stream-copy.js assets/poster.psd ~/Downloads/poster.psd
  11. LET'S TRY WITH A BIG FILE (~10GB): node --inspect-brk stream-copy.js assets/the-matrix-hd.mkv ~/Downloads/the-matrix-hd.mkv (the buffer version cannot cope here: a single Node.js Buffer is capped well below 10GB, so readFileSync would fail)
  12. STREAMS VS BUFFERS: Streams keep a low memory footprint even with large amounts of data. Streams allow you to process data as soon as it arrives. Stream processing generally does not block the event loop.
  13. ALL STREAMS ARE EVENT EMITTERS. A stream instance is an object that emits events when its internal state changes, for instance:
      s.on('readable', () => {}) // ready to be consumed
      s.on('data', (chunk) => {}) // new data is available
      s.on('error', (err) => {}) // some error happened
      s.on('end', () => {}) // no more data available
    The events available depend on the type of stream.
  14. WRITABLE STREAMS: a writable stream is an abstraction that allows you to write data to a destination. Examples: fs writeStream, process.stdout, process.stderr, HTTP request (client-side), HTTP response (server-side), AWS S3 PutObject (body parameter).
  15. WRITABLE STREAMS - METHODS: writable.write(chunk, [encoding], [callback]) - writable.end([chunk], [encoding], [callback])
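    A minimal usage sketch (my example, not from the deck) of the two methods above:

      // writable-methods.js (illustrative sketch)
      const { createWriteStream } = require('fs')

      const out = createWriteStream('notes.txt')
      out.write('first chunk\n')          // write() returns false when the internal buffer is full
      out.write('second chunk\n', 'utf8') // optional encoding for string chunks
      out.end('last chunk\n', () => {     // end() writes the final chunk and closes the stream
        console.log('all data flushed')
      })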
  16. // writable-http-request.js
      const http = require('http')

      const req = http.request(
        { hostname: 'enx6b07hdu6cs.x.pipedream.net', method: 'POST' },
        resp => { console.log(`Server responded with "${resp.statusCode}"`) }
      )
      req.on('finish', () => console.log('request sent'))
      req.on('close', () => console.log('Connection closed'))
      req.on('error', err => console.error(`Request failed: ${err}`))
      req.write('writing some content...\n')
      req.end('last write & close the stream')
  20. BACKPRESSURE: when writing large amounts of data you should make sure you handle the stop-write signal (write() returning false) and the drain event. loige.link/backpressure
  21. // stream-copy-safe.js
      const { createReadStream, createWriteStream } = require('fs')

      const [, , src, dest] = process.argv
      const srcStream = createReadStream(src)
      const destStream = createWriteStream(dest)
      srcStream.on('data', data => {
        const canContinue = destStream.write(data)
        if (!canContinue) {
          // we are overflowing the destination, we should pause
          srcStream.pause()
          // we will resume when the destination stream is drained
          destStream.once('drain', () => srcStream.resume())
        }
      })
  26. READABLE STREAMS: a readable stream represents a source from which data is consumed. Examples: fs readStream, process.stdin, HTTP response (client-side), HTTP request (server-side), AWS S3 GetObject (data field). It supports two modes for data consumption: flowing and paused (or non-flowing).
  27. READABLE STREAMS - EVENTS:
      readable.on('readable')
      readable.on('data', (chunk) => {})
      readable.on('end')
      readable.on('error', (err) => {})
  28. READABLE STREAMS - FLOWING MODE: data is read from the source automatically and chunks are emitted as soon as they are available.
  29. READABLE STREAMS - FLOWING MODE (animated diagram): the source chunks (1, 2, 3) are read one by one and each is emitted to the data listener as soon as it is available; when no more data is available, end is emitted.
  38. // count-emojis-flowing.js
      const { createReadStream } = require('fs')
      const { EMOJI_MAP } = require('emoji') // from npm

      const emojis = Object.keys(EMOJI_MAP)
      const file = createReadStream(process.argv[2])
      let counter = 0
      file.on('data', chunk => {
        for (let char of chunk.toString('utf8')) {
          if (emojis.includes(char)) { counter++ }
        }
      })
      file.on('end', () => console.log(`Found ${counter} emojis`))
      file.on('error', err => console.error(`Error reading file: ${err}`))
  40. READABLE STREAMS - PAUSED MODE: a consumer has to call the read method explicitly to read chunks of data from the stream. The stream sends a readable event to signal that new data is available.
  41. READABLE STREAMS - PAUSED MODE (animated diagram): chunks read from the source accumulate in the stream's internal buffer and a readable event signals that new data is available; nothing happens until the consumer calls read(), which returns buffered chunks one by one; calling read() on an empty buffer returns null (the stop-reading signal); the cycle repeats until the source is exhausted, then end is emitted.
  61. // count-emojis-paused.js
      const { createReadStream } = require('fs')
      const { EMOJI_MAP } = require('emoji') // from npm

      const emojis = Object.keys(EMOJI_MAP)
      const file = createReadStream(process.argv[2])
      let counter = 0
      file.on('readable', () => {
        let chunk
        while ((chunk = file.read()) !== null) {
          for (let char of chunk.toString('utf8')) {
            if (emojis.includes(char)) { counter++ }
          }
        }
      })
      file.on('end', () => console.log(`Found ${counter} emojis`))
      file.on('error', err => console.error(`Error reading file: ${err}`))
  64. READABLE STREAMS - MODE SWITCH CONDITIONS: all readable streams are created in paused mode. Paused streams can be switched to flowing mode with: stream.on('data', () => {}), stream.resume(), stream.pipe(). Flowing streams can switch back to paused with: stream.pause(), stream.unpipe() for all attached streams. (See the sketch below.)
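    A minimal sketch (my example, not from the deck) of the switch in action, throttling a flowing stream with pause()/resume():

      // mode-switch.js (illustrative sketch)
      const { createReadStream } = require('fs')

      const stream = createReadStream(process.argv[2])
      // attaching a 'data' listener switches the stream to flowing mode
      stream.on('data', chunk => {
        console.log(`got ${chunk.length} bytes`)
        stream.pause() // back to paused mode
        setTimeout(() => stream.resume(), 100) // flowing again after 100ms
      })
      stream.on('end', () => console.log('done'))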
  65. READABLE STREAMS: FLOWING VS PAUSED. Push VS Pull mental models. Flowing is simpler to use; paused gives you more control over how data is consumed from the source. Whichever you pick, stay consistent!
  66. BONUS MODE: readable streams are also Async Iterators (Node.js 10+). Warning: still experimental
  67. // count-emojis-async-iterator.js
      const { createReadStream } = require('fs')
      const { EMOJI_MAP } = require('emoji') // from npm

      async function main () {
        const emojis = Object.keys(EMOJI_MAP)
        const file = createReadStream(process.argv[2])
        let counter = 0
        for await (let chunk of file) {
          for (let char of chunk.toString('utf8')) {
            if (emojis.includes(char)) { counter++ }
          }
        }
        console.log(`Found ${counter} emojis`)
      }

      main()
  69. Still experimental in Node.js 11. If you like this API and don't want to rely on an experimental core feature: 2ality.com/2018/04/async-iter-nodejs.html - github.com/lorenzofox3/for-await
  70. OBJECT MODE: readable streams can emit objects if this mode is enabled
      // readable-timer.js
      const { Readable } = require('stream')

      const timerStream = new Readable({
        objectMode: true,
        read () {
          this.push(new Date()) // this is an object
        }
      })
      timerStream.on('data', (currentDate) => {
        // prints the current second
        console.log(currentDate.getSeconds())
      })
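    Writable streams support objectMode as well; a minimal sketch (my example, not from the deck) of an object-mode writable consuming the timer above:

      // date-logger.js (illustrative sketch)
      const { Writable } = require('stream')

      const logDates = new Writable({
        objectMode: true,
        // each chunk is a Date object, not a Buffer
        write (date, _encoding, done) {
          console.log(date.toISOString())
          done()
        }
      })

      timerStream.pipe(logDates) // assuming timerStream from the slide above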
  74. OTHER TYPES OF STREAM: Duplex Stream - streams that are both Readable and Writable (net.Socket). Transform Stream - Duplex streams that can modify or transform the data as it is written and read (zlib.createGzip(), crypto.createCipheriv()).
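    A quick way to see a Duplex in action without opening a socket (my example, not from the deck) is stream.PassThrough, the simplest built-in Duplex/Transform, which just forwards whatever is written to it:

      // passthrough-example.js (illustrative sketch)
      const { PassThrough } = require('stream')

      const tunnel = new PassThrough()
      tunnel.on('data', chunk => console.log(`read: ${chunk.toString()}`)) // readable side
      tunnel.write('hello ') // writable side
      tunnel.end('world')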
  75. ANATOMY OF A TRANSFORM STREAM: 1. write data (writable side) - 2. transform the data - 3. read transformed data (readable side)
  76. GZIP EXAMPLE: 1. write uncompressed data (writable side) - 2. compress the data (createGzip()) - 3. read compressed data (readable side)
  77. // stream-copy-gzip.js
      const { createReadStream, createWriteStream } = require('fs')
      const { createGzip } = require('zlib')

      const [, , src, dest] = process.argv
      const srcStream = createReadStream(src)
      const gzipStream = createGzip()
      const destStream = createWriteStream(dest)

      srcStream.on('data', data => {
        const canContinue = gzipStream.write(data)
        if (!canContinue) {
          srcStream.pause()
          gzipStream.once('drain', () => { srcStream.resume() })
        }
      })
      srcStream.on('end', () => {
        // check if there's buffered data left
        const remainingData = gzipStream.read()
        if (remainingData !== null) {
          destStream.write(remainingData)
        }
        gzipStream.end()
      })

      gzipStream.on('data', data => {
        const canContinue = destStream.write(data)
        if (!canContinue) {
          gzipStream.pause()
          destStream.once('drain', () => { gzipStream.resume() })
        }
      })
      gzipStream.on('end', () => { destStream.end() })
      // TODO: handle errors! >:)
  79. pipe() connects a readable stream to a writable stream. A transform stream can be used as a destination as well. It returns the destination stream, allowing for a chain of pipes:
      readable.pipe(writableDest)

      readable
        .pipe(transform1)
        .pipe(transform2)
        .pipe(transform3)
        .pipe(writable)
  80. // stream-copy-gzip-pipe.js
      const { createReadStream, createWriteStream } = require('fs')
      const { createGzip } = require('zlib')

      const [, , src, dest] = process.argv
      const srcStream = createReadStream(src)
      const gzipStream = createGzip()
      const destStream = createWriteStream(dest)

      srcStream
        .pipe(gzipStream)
        .pipe(destStream)
  83. Handling errors (correctly):
      readable
        .on('error', handleErr)
        .pipe(decompress)
        .on('error', handleErr)
        .pipe(decrypt)
        .on('error', handleErr)
        .pipe(convert)
        .on('error', handleErr)
        .pipe(encrypt)
        .on('error', handleErr)
        .pipe(compress)
        .on('error', handleErr)
        .pipe(writeToDisk)
        .on('error', handleErr)
    handleErr should end and destroy the streams (it doesn't happen automatically)
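    One possible shape for handleErr (my sketch, not the speaker's code), assuming we keep references to all the streams in the chain:

      // handle-err.js (illustrative sketch)
      function makeHandleErr (...streams) {
        return function handleErr (err) {
          console.error(`Pipeline failed: ${err}`)
          // pipe() does not destroy the other streams on error,
          // so we have to clean them up ourselves to avoid leaks
          streams.forEach(s => s.destroy())
        }
      }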
  86. stream.pipeline(...streams, callback): you can pass multiple streams (they will be piped together). The last argument is a callback: if it is invoked with an error, it means the pipeline failed at some point. All the streams are ended and destroyed correctly.
      // stream-copy-gzip-pipeline.js
      const { pipeline } = require('stream')
      const { createReadStream, createWriteStream } = require('fs')
      const { createGzip } = require('zlib')

      const [, , src, dest] = process.argv
      pipeline(
        createReadStream(src),
        createGzip(),
        createWriteStream(dest),
        function onEnd (err) {
          if (err) {
            console.error(`Error: ${err}`)
            process.exit(1)
          }
          console.log('Done!')
        }
      )
  89. stream.pipeline is available in Node.js 10+. On older versions you can use pump - npm.im/pump
      // stream-copy-gzip-pump.js
      const pump = require('pump') // from npm
      const { createReadStream, createWriteStream } = require('fs')
      const { createGzip } = require('zlib')

      const [, , src, dest] = process.argv
      pump( // just swap pipeline with pump!
        createReadStream(src),
        createGzip(),
        createWriteStream(dest),
        function onEnd (err) {
          if (err) {
            console.error(`Error: ${err}`)
            process.exit(1)
          }
          console.log('Done!')
        }
      )
  91. pumpify(...streams) - create reusable pieces of pipeline - npm.im/pumpify. Let's create EncGz, an application that helps us read and write encrypted-gzipped files.
  92. // encgz-stream.js - utility library
      const { createCipheriv, createDecipheriv, randomBytes, createHash } = require('crypto')
      const { createGzip, createGunzip } = require('zlib')
      const pumpify = require('pumpify') // from npm

      // calculates md5 of the secret (trimmed)
      function getChiperKey (secret) {}

      function createEncgz (secret) {
        const initVect = randomBytes(16)
        const cipherKey = getChiperKey(secret)
        const encryptStream = createCipheriv('aes256', cipherKey, initVect)
        const gzipStream = createGzip()
        const stream = pumpify(encryptStream, gzipStream)
        stream.initVect = initVect
        return stream
      }
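    The body of getChiperKey is elided on the slide; a possible implementation (my sketch, following the comment above) that would yield the 32-byte key aes256 expects:

      // sketch: md5 of the trimmed secret, as a 32-char hex string
      // (32 ASCII chars = the 32 bytes the aes256 cipher requires)
      function getChiperKey (secret) {
        return createHash('md5')
          .update(secret.trim())
          .digest('hex')
      }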
  96. // encgz-stream.js (...continue from previous slide)
      function createDecgz (secret, initVect) {
        const cipherKey = getChiperKey(secret)
        const decryptStream = createDecipheriv('aes256', cipherKey, initVect)
        const gunzipStream = createGunzip()
        const stream = pumpify(gunzipStream, decryptStream)
        return stream
      }

      module.exports = { createEncgz, createDecgz }
  101. // encgz.js - CLI to encrypt and gzip (from stdin to stdout)
      const { pipeline } = require('stream')
      const { createEncgz } = require('./encgz-stream')

      const [, , secret] = process.argv
      const encgz = createEncgz(secret)
      console.error(`init vector: ${encgz.initVect.toString('hex')}`)

      pipeline(
        process.stdin,
        encgz,
        process.stdout,
        function onEnd (err) {
          if (err) {
            console.error(`Error: ${err}`)
            process.exit(1)
          }
        }
      )
  105. // decgz.js - CLI to gunzip and decrypt (from stdin to stdout)
      const { pipeline } = require('stream')
      const { createDecgz } = require('./encgz-stream')

      const [, , secret, initVect] = process.argv
      const decgz = createDecgz(secret, Buffer.from(initVect, 'hex'))

      pipeline(
        process.stdin,
        decgz,
        process.stdout,
        function onEnd (err) {
          if (err) {
            console.error(`Error: ${err}`)
            process.exit(1)
          }
        }
      )
  109. stream.finished(stream, callback): get notified when a stream is no longer readable or writable - Node.js 10+
      // finished can be promisified!
      const finished = util.promisify(stream.finished)

      const rs = fs.createReadStream('archive.tar')

      async function run () {
        await finished(rs)
        console.log('Stream is done reading.')
      }

      run().catch(console.error)
      rs.resume() // start & drain the stream
  112. readable-stream - npm package that contains the latest version of the Node.js stream library. It also makes Node.js streams compatible with the browser (can be used with Webpack and Browserify) npm.im/readable-stream
    * yeah, the name is misleading: the package offers all the functionality of the official 'stream' module, not just readable streams *
  113. // emoji-stream.js (custom readable stream)
      const { EMOJI_MAP } = require('emoji') // from npm
      const { Readable } = require('readable-stream') // from npm

      const emojis = Object.keys(EMOJI_MAP)

      function getEmojiDescription (index) {
        return EMOJI_MAP[emojis[index]][1]
      }

      function getMessage (index) {
        return emojis[index] + ' ' + getEmojiDescription(index)
      }

      class EmojiStream extends Readable {
        constructor (options) {
          super(options)
          this._index = 0
        }

        _read () {
          if (this._index >= emojis.length) {
            return this.push(null)
          }
          return this.push(getMessage(this._index++))
        }
      }

      module.exports = EmojiStream
  119. // uppercasify.js (custom transform stream)
      const { Transform } = require('readable-stream')

      class Uppercasify extends Transform {
        _transform (chunk, encoding, done) {
          this.push(chunk.toString().toUpperCase())
          done()
        }

        _flush (done) {
          // in case there's buffered data
          // that still has to be pushed
          done() // the callback must be invoked, or the stream never finishes
        }
      }

      module.exports = Uppercasify
  126. // dom-append.js (custom writable stream)
      const { Writable } = require('readable-stream')

      class DOMAppend extends Writable {
        constructor (target, tag = 'p', options) {
          super(options)
          this._target = target
          this._tag = tag
        }

        _write (chunk, encoding, done) {
          const elem = document.createElement(this._tag)
          const content = document.createTextNode(chunk.toString())
          elem.appendChild(content)
          this._target.appendChild(elem)
          done()
        }
      }

      module.exports = DOMAppend
  133. // browser/app.js
      const EmojiStream = require('../emoji-stream')
      const Uppercasify = require('../uppercasify')
      const DOMAppend = require('../dom-append')

      const list = document.getElementById('list')

      const emoji = new EmojiStream()
      const uppercasify = new Uppercasify()
      const append = new DOMAppend(list, 'li')

      emoji
        .pipe(uppercasify)
        .pipe(append)
  138. Let's use webpack to build this app for the browser:
      npm i --save-dev webpack webpack-cli
      node_modules/.bin/webpack src/browser/app.js
      # creates dist/main.js
      mv dist/main.js src/browser/app.bundle.js
  139. Finally let's create an index.html:
      <!DOCTYPE html>
      <html lang="en">
        <head>
          <meta charset="utf-8" />
          <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no" />
          <title>Streams in the browser!</title>
        </head>
        <body>
          <ul id="list"></ul>
          <script src="app.bundle.js"></script>
        </body>
      </html>
  142. TLDR: Streams have a low memory footprint. Process data as soon as it's available. Composition through pipelines. Define your logic as transform streams. Readable and writable streams are good abstractions for input & output. You can swap them as needed without having to change the logic.
  143. IF YOU WANT TO LEARN (EVEN) MOAR ABOUT STREAMS... nodejs.org/api/stream.html - github.com/substack/stream-handbook
  144. GRAZIE! (Thank you!) loige.link/streams-rome - PLEASE VOTE THIS TALK :) We are hiring, talk to me! :)
  145. CREDITS: emojiart.org for the amazing St. Patrick emoji art; the internet for the memes! :D SPECIAL THANKS to @mariocasciaro, @machine_person, @Podgeypoos79, @katavic_d, @UrsoLuca