Slide 1

Luciano Mammino (@loige)

IT’S ABOUT TIME TO EMBRACE STREAMS

loige.link/streams-cityjs
London - May 3, 2019

Slide 2

// buffer-copy.js
const { readFileSync, writeFileSync } = require('fs')

const [,, src, dest] = process.argv

// read entire file content
const content = readFileSync(src)

// write that content somewhere else
writeFileSync(dest, content)

Slide 9

WE DO THIS ALL THE TIME
AND IT'S OK
BUT SOMETIMES ...

Slide 10

ERR_FS_FILE_TOO_LARGE!
File size is greater than possible Buffer
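
How large is "possible"? The cap depends on the Node.js version and platform; a minimal sketch (not from the original deck) to check it on your machine:

// max-buffer.js - print the biggest Buffer this system allows
const buffer = require('buffer')

// on many 64-bit systems this was about 2 GiB in the Node.js versions of this era
console.log(`Max buffer size: ${buffer.constants.MAX_LENGTH} bytes`)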

Slide 11

BUT WHY?

Slide 12

IF BYTES WERE BLOCKS...

Slide 13

MARIO CAN LIFT A FEW BLOCKS

Slide 14

BUT NOT TOO MANY... ?!

Slide 15

WHAT CAN WE DO IF WE HAVE TO MOVE MANY BLOCKS?

Slide 16

WE CAN MOVE THEM ONE BY ONE!
we stream them...

Slide 23

HELLO, I AM LUCIANO!
Cloud Architect

Blog: loige.co
Twitter: @loige
GitHub: @lmammino

Slide 25

code: loige.link/streams-examples
slides: loige.link/streams-cityjs

Slide 26

01. BUFFERS VS STREAMS

Slide 27

BUFFER: DATA STRUCTURE TO STORE AND TRANSFER ARBITRARY BINARY DATA

* Note that this is loading all the content of the file in memory *
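
As a quick illustration (a minimal sketch, not from the original deck), buffers can be created from strings or raw bytes and concatenated:

// buffer-basics.js
const hello = Buffer.from('Hello ')
const world = Buffer.from([0x77, 0x6f, 0x72, 0x6c, 0x64]) // the bytes for "world"
const combined = Buffer.concat([hello, world])

console.log(combined.toString('utf8')) // prints "Hello world"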

Slide 28

STREAM: ABSTRACT INTERFACE FOR WORKING WITH STREAMING DATA

* It does not load all the data straight away *

Slide 29

FILE COPY: THE BUFFER WAY

// buffer-copy.js
const { readFileSync, writeFileSync } = require('fs')

const [,, src, dest] = process.argv

const content = readFileSync(src)
writeFileSync(dest, content)

Slide 30

FILE COPY: THE STREAM WAY

// stream-copy.js
const { createReadStream, createWriteStream } = require('fs')

const [,, src, dest] = process.argv

const srcStream = createReadStream(src)
const destStream = createWriteStream(dest)

srcStream.on('data', (data) => destStream.write(data))

* Careful: this implementation is not optimal *

Slide 34

MEMORY COMPARISON (~600MB FILE)

node --inspect-brk buffer-copy.js assets/poster.psd ~/Downloads/poster.psd

Slide 35

MEMORY COMPARISON (~600MB FILE)

node --inspect-brk stream-copy.js assets/poster.psd ~/Downloads/poster.psd

Slide 36

LET'S TRY WITH A BIG FILE (~10GB)

node --inspect-brk stream-copy.js assets/the-matrix-hd.mkv ~/Downloads/the-matrix-hd.mkv

Slide 37

STREAMS VS BUFFERS

Streams keep a low memory footprint even with large amounts of data.
Streams allow you to process data as soon as it arrives.

Slide 38

02. STREAM TYPES & APIS

Slide 39

ALL STREAMS ARE EVENT EMITTERS

A stream instance is an object that emits events when its internal state changes, for instance:

s.on('readable', () => {}) // ready to be consumed
s.on('data', (chunk) => {}) // new data is available
s.on('error', (err) => {}) // some error happened
s.on('end', () => {}) // no more data available

The events available depend on the type of stream.

Slide 40

READABLE STREAMS

A readable stream represents a source from which data is consumed.

Examples:
- fs readStream
- process.stdin
- HTTP response (client-side)
- HTTP request (server-side)
- AWS S3 GetObject (data field)

It supports two modes for data consumption: flowing and paused (or non-flowing) mode.
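
The next slides illustrate flowing mode. For contrast, here is a minimal sketch of paused mode (not from the original deck), where you pull chunks explicitly with read() inside a readable listener:

// count-bytes-paused.js - consume a readable stream in paused mode
const { createReadStream } = require('fs')

const file = createReadStream(process.argv[2])
let bytes = 0

file.on('readable', () => {
  let chunk
  // read() returns null once the internal buffer is empty
  while ((chunk = file.read()) !== null) {
    bytes += chunk.length
  }
})
file.on('end', () => console.log(`Read ${bytes} bytes`))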

Slide 41

READABLE STREAMS

In flowing mode, data is read from the source automatically and chunks are emitted to the data listener as soon as they are available. When no more data is available, end is emitted.

Slide 51

// count-emojis-flowing.js
const { createReadStream } = require('fs')
const { EMOJI_MAP } = require('emoji') // from npm

const emojis = Object.keys(EMOJI_MAP)
const file = createReadStream(process.argv[2])
let counter = 0

file.on('data', chunk => {
  for (let char of chunk.toString('utf8')) {
    if (emojis.includes(char)) {
      counter++
    }
  }
})
file.on('end', () => console.log(`Found ${counter} emojis`))
file.on('error', err => console.error(`Error reading file: ${err}`))

Slide 53

loige.link/st_patrick

Slide 54

WRITABLE STREAMS

A writable stream is an abstraction that allows you to write data to a destination.

Examples:
- fs writeStream
- process.stdout, process.stderr
- HTTP request (client-side)
- HTTP response (server-side)
- AWS S3 PutObject (body parameter)

Slide 55

// writable-http-request.js
const http = require('http')

const req = http.request(
  {
    hostname: 'enx6b07hdu6cs.x.pipedream.net',
    method: 'POST'
  },
  resp => {
    console.log(`Server responded with "${resp.statusCode}"`)
  }
)

req.on('finish', () => console.log('request sent'))
req.on('close', () => console.log('Connection closed'))
req.on('error', err => console.error(`Request failed: ${err}`))

req.write('writing some content...\n')
req.end('last write & close the stream')

Slide 60

loige.link/writable-http-req

Slide 61

BACKPRESSURE

When writing large amounts of data you should make sure you handle the stop-write signal and the drain event.

loige.link/backpressure

Slide 62

// stream-copy-safe.js
const { createReadStream, createWriteStream } = require('fs')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const destStream = createWriteStream(dest)

srcStream.on('data', data => {
  const canContinue = destStream.write(data)
  if (!canContinue) {
    // we are overflowing the destination, we should pause
    srcStream.pause()
    // we will resume when the destination stream is drained
    destStream.once('drain', () => srcStream.resume())
  }
})

Slide 67

OTHER TYPES OF STREAM

Duplex Stream: streams that are both Readable and Writable (e.g. net.Socket).

Transform Stream: Duplex streams that can modify or transform the data as it is written and read (e.g. zlib.createGzip(), crypto.createCipheriv()).
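
You don't always need a subclass to get a transform: the stream module also accepts the transform logic as a constructor option. A minimal sketch (not from the original deck):

// reverse-chunks.js - a Transform built with the simplified constructor
const { Transform } = require('stream')

const reverser = new Transform({
  transform (chunk, encoding, done) {
    // emit each chunk with its characters reversed
    this.push(chunk.toString().split('').reverse().join(''))
    done()
  }
})

reverser.on('data', data => console.log(data.toString()))
reverser.write('stressed') // prints "desserts"
reverser.end()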

Slide 68

ANATOMY OF A TRANSFORM STREAM

1. write data (writable side)
2. transform the data
3. read transformed data (readable side)

Slide 69

GZIP EXAMPLE

Uncompressed data -> zlib.createGzip() (compress) -> Compressed data

Slide 70

// stream-copy-gzip.js
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const gzipStream = createGzip()
const destStream = createWriteStream(dest)

srcStream.on('data', data => {
  const canContinue = gzipStream.write(data)
  if (!canContinue) {
    srcStream.pause()
    gzipStream.once('drain', () => {
      srcStream.resume()
    })
  }
})

srcStream.on('end', () => {
  // check if there's buffered data left
  const remainingData = gzipStream.read()
  if (remainingData !== null) {
    destStream.write(remainingData)
  }
  gzipStream.end()
})

gzipStream.on('data', data => {
  const canContinue = destStream.write(data)
  if (!canContinue) {
    gzipStream.pause()
    destStream.once('drain', () => {
      gzipStream.resume()
    })
  }
})

gzipStream.on('end', () => {
  destStream.end()
})

// ⚠ TODO: handle errors!

Slide 72

03. PIPE()

Slide 73

readable.pipe(writableDest)

Connects a readable stream to a writable stream. A transform stream can be used as a destination as well. It returns the destination stream, allowing for a chain of pipes:

readable
  .pipe(transform1)
  .pipe(transform2)
  .pipe(transform3)
  .pipe(writable)

Slide 74

// stream-copy-gzip-pipe.js
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const gzipStream = createGzip()
const destStream = createWriteStream(dest)

srcStream
  .pipe(gzipStream)
  .pipe(destStream)

Slide 77

Setting up complex pipelines with pipe is the most common way to use streams:

readable
  .pipe(decompress)
  .pipe(decrypt)
  .pipe(convert)
  .pipe(encrypt)
  .pipe(compress)
  .pipe(writeToDisk)

Slide 80

Handling errors (correctly):

readable
  .on('error', handleErr)
  .pipe(decompress)
  .on('error', handleErr)
  .pipe(decrypt)
  .on('error', handleErr)
  .pipe(convert)
  .on('error', handleErr)
  .pipe(encrypt)
  .on('error', handleErr)
  .pipe(compress)
  .on('error', handleErr)
  .pipe(writeToDisk)
  .on('error', handleErr)

handleErr should end and destroy the streams (it doesn't happen automatically).
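
A sketch of what such a handler could look like (a hypothetical helper, assuming the streams of the chain above are in scope):

// illustrative only: build one error handler that tears down the whole chain
function makeErrorHandler (...streams) {
  return function handleErr (err) {
    console.error(err)
    // pipe() does not clean up for us, so destroy every stream ourselves
    streams.forEach(stream => stream.destroy())
  }
}

const handleErr = makeErrorHandler(
  readable, decompress, decrypt, convert, encrypt, compress, writeToDisk
)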

Slide 81

04. STREAM UTILITIES

Slide 84

stream.pipeline(...streams, callback)

// stream-copy-gzip-pipeline.js
const { pipeline } = require('stream')
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv

pipeline(
  createReadStream(src),
  createGzip(),
  createWriteStream(dest),
  function onEnd (err) {
    if (err) {
      console.error(`Error: ${err}`)
      process.exit(1)
    }
    console.log('Done!')
  }
)

You can pass multiple streams (they will be piped). The last argument is a callback: if invoked with an error, it means the pipeline failed at some point. All the streams are ended and destroyed correctly.
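
pipeline also plays nicely with promises. A minimal sketch using util.promisify (newer Node.js releases also expose a promise-based pipeline directly, but promisify works here too):

// stream-copy-gzip-pipeline-promise.js (sketch)
const { pipeline } = require('stream')
const { promisify } = require('util')
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const pipelinePromise = promisify(pipeline)
const [, , src, dest] = process.argv

async function main () {
  // resolves when the pipeline completes, rejects on any stream error
  await pipelinePromise(
    createReadStream(src),
    createGzip(),
    createWriteStream(dest)
  )
  console.log('Done!')
}

main().catch(err => {
  console.error(`Error: ${err}`)
  process.exit(1)
})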

Slide 85

readable-stream: npm package that contains the latest version of the Node.js stream library. It also makes Node.js streams compatible with the browser (can be used with Webpack and Browserify).

npm.im/readable-stream

* Yeah, the name is misleading: the package offers all the functionality of the official 'stream' package, not just readable streams. *
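
Adopting it is typically just a swap of the module name (a sketch; recent versions of the package also re-export utilities such as pipeline):

// instead of the core module:
// const { Readable, Transform, Writable } = require('stream')
// require the npm package, which also works when bundled for the browser:
const { Readable, Transform, Writable } = require('readable-stream')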

Slide 86

05. WRITING CUSTOM STREAMS

Slide 97

EmojiStream -> Uppercasify -> DOMAppend
(e.g. Lemon becomes LEMON, Banana becomes BANANA)

class EmojiStream extends Readable {
  _read () {
    // ...
  }
}

class Uppercasify extends Transform {
  _transform (chunk, enc, done) {
    // ...
  }
}

class DOMAppend extends Writable {
  _write (chunk, enc, done) {
    // ...
  }
}

Use this.push(data) to pass data to the next step.

Slide 98

// emoji-stream.js (custom readable stream)
const { EMOJI_MAP } = require('emoji') // from npm
const { Readable } = require('readable-stream') // from npm

const emojis = Object.keys(EMOJI_MAP)

function getEmojiDescription (index) {
  return EMOJI_MAP[emojis[index]][1]
}

function getMessage (index) {
  return emojis[index] + ' ' + getEmojiDescription(index)
}

class EmojiStream extends Readable {
  constructor (options) {
    super(options)
    this._index = 0
  }

  _read () {
    if (this._index >= emojis.length) {
      return this.push(null)
    }
    return this.push(getMessage(this._index++))
  }
}

module.exports = EmojiStream

Slide 104

// uppercasify.js (custom transform stream)
const { Transform } = require('readable-stream')

class Uppercasify extends Transform {
  _transform (chunk, encoding, done) {
    this.push(chunk.toString().toUpperCase())
    done()
  }
}

module.exports = Uppercasify

Slide 110

// dom-append.js (custom writable stream)
const { Writable } = require('readable-stream')

class DOMAppend extends Writable {
  _write (chunk, encoding, done) {
    const elem = document.createElement('li')
    const content = document.createTextNode(chunk.toString())
    elem.appendChild(content)
    document.getElementById('list').appendChild(elem)
    done()
  }
}

module.exports = DOMAppend

Slide 116

06. STREAMS IN THE BROWSER

Slide 117

// browser/app.js
const EmojiStream = require('../emoji-stream')
const Uppercasify = require('../uppercasify')
const DOMAppend = require('../dom-append')

const emoji = new EmojiStream()
const uppercasify = new Uppercasify()
const append = new DOMAppend()

emoji
  .pipe(uppercasify)
  .pipe(append)

Slide 121

Let's use webpack to build this app for the browser:

npm i --save-dev webpack webpack-cli
node_modules/.bin/webpack src/browser/app.js
# creates dist/main.js
mv dist/main.js src/browser/app-bundle.js

Slide 122

Finally let's create an index.html: a page with a "Streams in the browser!" heading, a list element with id "list" (the target used by DOMAppend), and a script tag that loads app-bundle.js.

        Slide 127

07. CLOSING

        Slide 128

TLDR;

- Streams have low memory footprint
- Process data as soon as it's available
- Composition through pipelines
- Streams are abstractions:
  - Readable = Input
  - Transform = Business Logic
  - Writable = Output

        Slide 129

IF YOU WANT TO LEARN (EVEN) MOAR ABOUT STREAMS...

nodejs.org/api/stream.html
github.com/substack/stream-handbook

        Slide 130

IF YOU ARE NOT CONVINCED YET...

curl parrot.live

        Slide 131

Check out the codebase: github.com/hugomd/parrot.live

        Slide 133

THANKS!

loige.link/streams-cityjs

We are hiring, talk to me! :)

        Slide 134

CREDITS

Dan Roizer on Unsplash for the cover picture
emojiart.org for the amazing St. Patrick emoji art
The internet for the memes! :D

SPECIAL THANKS

@StefanoAbalsamo, @mariocasciaro, @machine_person, @Podgeypoos79, @katavic_d, @UrsoLuca