Slide 1

IT'S ABOUT TIME TO EMBRACE STREAMS

Luciano Mammino (@loige) 05/03/2019 loige.link/streams-dub

Slide 2

loige.link/streams-dub code: loige.link/streams-examples

Slide 3

HELLO, I AM LUCIANO!
Cloud Architect
Let's connect!
Blog: loige.co
Twitter: @loige
GitHub: @lmammino
loige.link/node-patterns (with @mariocasciaro)
fstack.link (with @andreaman87)

Slide 4

AGENDA
01. Buffers VS Streams
02. Stream types & APIs
03. Pipe()
04. Streams utilities
05. Writing custom streams
06. Streams in the Browser

Slide 5

01. BUFFERS VS STREAMS

Slide 6

BUFFER: DATA STRUCTURE TO STORE AND TRANSFER ARBITRARY BINARY DATA
* Note that this loads all the content of the file in memory *

Slide 7

STREAM: ABSTRACT INTERFACE FOR WORKING WITH STREAMING DATA
* It does not load all the data straight away *

Slide 8

LET'S SOLVE THIS PROBLEM
1. Read the content of a file
2. Copy it to another file*
* cp in Node.js

Slide 9

THE BUFFER WAY

// buffer-copy.js
const { readFileSync, writeFileSync } = require('fs')

const [,, src, dest] = process.argv
const content = readFileSync(src)
writeFileSync(dest, content)

Slide 10

THE STREAM WAY

// stream-copy.js
const { createReadStream, createWriteStream } = require('fs')

const [,, src, dest] = process.argv
const srcStream = createReadStream(src)
const destStream = createWriteStream(dest)
srcStream.on('data', (data) => destStream.write(data))

* Careful: this implementation is not optimal *

Slide 11

MEMORY COMPARISON (~600MB FILE)

node --inspect-brk buffer-copy.js assets/poster.psd ~/Downloads/poster.psd

Slide 12

MEMORY COMPARISON (~600MB FILE)

node --inspect-brk stream-copy.js assets/poster.psd ~/Downloads/poster.psd

Slide 13

LET'S TRY WITH A BIG FILE (~10GB)

Slide 14

LET'S TRY WITH A BIG FILE (~10GB)

node --inspect-brk stream-copy.js assets/the-matrix-hd.mkv ~/Downloads/the-matrix-hd.mkv

Slide 15

IF BYTES WERE BLOCKS...

Slide 16

BIG BUFFER APPROACH

Slide 17

STREAMING APPROACH

Slide 18

STREAMS VS BUFFERS
Streams keep a low memory footprint even with large amounts of data
Streams allow you to process data as soon as it arrives
Stream processing generally does not block the event loop

Slide 19

02. STREAM TYPES & APIS

Slide 20

ALL STREAMS ARE EVENT EMITTERS
A stream instance is an object that emits events when its internal state changes, for instance:

s.on('readable', () => {}) // ready to be consumed
s.on('data', (chunk) => {}) // new data is available
s.on('error', (err) => {}) // some error happened
s.on('end', () => {}) // no more data available

The events available depend on the type of stream

Slide 21

WRITABLE STREAMS
A writable stream is an abstraction that allows you to write data to a destination
Examples:
fs writeStream
process.stdout, process.stderr
HTTP request (client-side)
HTTP response (server-side)
AWS S3 PutObject (body parameter)

Slide 22

WRITABLE STREAMS - METHODS

writable.write(chunk, [encoding], [callback])
writable.end([chunk], [encoding], [callback])

Slide 23

WRITABLE STREAMS - EVENTS

writable.on('drain')
writable.on('close')
writable.on('finish')
writable.on('error', (err) => {})

Slide 24

// writable-http-request.js
const http = require('http')

const req = http.request(
  {
    hostname: 'enx6b07hdu6cs.x.pipedream.net',
    method: 'POST'
  },
  resp => {
    console.log(`Server responded with "${resp.statusCode}"`)
  }
)

req.on('finish', () => console.log('request sent'))
req.on('close', () => console.log('Connection closed'))
req.on('error', err => console.error(`Request failed: ${err}`))

req.write('writing some content...\n')
req.end('last write & close the stream')



Slide 29

loige.link/writable-http-req

Slide 30

BACKPRESSURE
When writing large amounts of data you should make sure you handle the stop-write signal (write() returning false) and the drain event
loige.link/backpressure

Slide 31

// stream-copy-safe.js
const { createReadStream, createWriteStream } = require('fs')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const destStream = createWriteStream(dest)

srcStream.on('data', data => {
  const canContinue = destStream.write(data)
  if (!canContinue) {
    // we are overflowing the destination, we should pause
    srcStream.pause()
    // we will resume when the destination stream is drained
    destStream.once('drain', () => srcStream.resume())
  }
})


Slide 36

READABLE STREAMS
A readable stream represents a source from which data is consumed.
Examples:
fs readStream
process.stdin
HTTP response (client-side)
HTTP request (server-side)
AWS S3 GetObject (data field)
It supports two modes for data consumption: flowing and paused (or non-flowing) mode.

Slide 37

READABLE STREAMS - METHODS

readable.read([size])
readable.pause()
readable.resume()

Slide 38

READABLE STREAMS - EVENTS

readable.on('readable')
readable.on('data', (chunk) => {})
readable.on('end')
readable.on('error', (err) => {})

Slide 39

READABLE STREAMS - FLOWING MODE
Data is read from the source automatically and chunks are emitted as soon as they are available.

Slides 40-48 (animation)

READABLE STREAMS - FLOWING MODE
[Diagram: a readable stream in flowing mode reads chunks 1, 2, 3 from the source and emits each one to the data listener as a 'data' event; when no more data is available, end is emitted.]

Slide 49

// count-emojis-flowing.js
const { createReadStream } = require('fs')
const { EMOJI_MAP } = require('emoji') // from npm

const emojis = Object.keys(EMOJI_MAP)
const file = createReadStream(process.argv[2])
let counter = 0

file.on('data', chunk => {
  for (let char of chunk.toString('utf8')) {
    if (emojis.includes(char)) {
      counter++
    }
  }
})
file.on('end', () => console.log(`Found ${counter} emojis`))
file.on('error', err => console.error(`Error reading file: ${err}`))


Slide 51

loige.link/st_patrick

Slide 52

READABLE STREAMS - PAUSED MODE
A consumer has to call the read method explicitly to read chunks of data from the stream. The stream sends a readable event to signal that new data is available.

Slides 53-72 (animation)

READABLE STREAMS - PAUSED MODE
[Diagram: a readable stream in paused mode reads chunks 1, 2, 3 from the source into an internal buffer and emits a readable event for each; nothing happens until the consumer calls read(), which returns one buffered chunk at a time. Calling read() on an empty buffer returns null (stop-reading signal). When the source is exhausted, end is emitted.]

Slide 73

// count-emojis-paused.js
const { createReadStream } = require('fs')
const { EMOJI_MAP } = require('emoji') // from npm

const emojis = Object.keys(EMOJI_MAP)
const file = createReadStream(process.argv[2])
let counter = 0

file.on('readable', () => {
  let chunk
  while ((chunk = file.read()) !== null) {
    for (let char of chunk.toString('utf8')) {
      if (emojis.includes(char)) {
        counter++
      }
    }
  }
})
file.on('end', () => console.log(`Found ${counter} emojis`))
file.on('error', err => console.error(`Error reading file: ${err}`))



Slide 77

READABLE STREAMS - MODE SWITCH CONDITIONS
All readable streams are created in paused mode
Paused streams can be switched to flowing mode with:
stream.on('data', () => {})
stream.resume()
stream.pipe()
Flowing streams can switch back to paused with:
stream.pause()
stream.unpipe() for all attached streams

Slide 78

READABLE STREAMS - FLOWING VS PAUSED
Push VS Pull mental models
Flowing is simpler to use
Paused gives you more control over how data is consumed from the source
Whichever you pick, stay consistent!

Slide 79

BONUS MODE
Readable streams are also Async Iterators (Node.js 10+)
Warning: still experimental

Slide 80

// count-emojis-async-iterator.js
const { createReadStream } = require('fs')
const { EMOJI_MAP } = require('emoji') // from npm

async function main () {
  const emojis = Object.keys(EMOJI_MAP)
  const file = createReadStream(process.argv[2])
  let counter = 0
  for await (let chunk of file) {
    for (let char of chunk.toString('utf8')) {
      if (emojis.includes(char)) {
        counter++
      }
    }
  }
  console.log(`Found ${counter} emojis`)
}

main()


Slide 82

Still experimental in Node.js 11
If you like this API and don't want to rely on an experimental core feature:
2ality.com/2018/04/async-iter-nodejs.html
github.com/lorenzofox3/for-await


Slide 85

OBJECT MODE
Readable streams can emit objects if this mode is enabled

// readable-timer.js
const { Readable } = require('stream')

const timerStream = new Readable({
  objectMode: true,
  read () {
    this.push(new Date()) // this is an object
  }
})

timerStream.on('data', (currentDate) => {
  // prints the current second
  console.log(currentDate.getSeconds())
})


Slide 87

OTHER TYPES OF STREAM
Duplex Stream: streams that are both Readable and Writable (net.Socket)
Transform Stream: Duplex streams that can modify or transform the data as it is written and read (zlib.createGzip(), crypto.createCipheriv())

Slide 88

ANATOMY OF A TRANSFORM STREAM
1. write data (writable stream side)
2. transform the data
3. read transformed data (readable stream side)

Slide 89

GZIP EXAMPLE
1. write uncompressed data (writable stream side)
2. compress the data with zlib.createGzip()
3. read compressed data (readable stream side)

Slide 90

// stream-copy-gzip.js
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const gzipStream = createGzip()
const destStream = createWriteStream(dest)

srcStream.on('data', data => {
  const canContinue = gzipStream.write(data)
  if (!canContinue) {
    srcStream.pause()
    gzipStream.once('drain', () => {
      srcStream.resume()
    })
  }
})

srcStream.on('end', () => {
  // check if there's buffered data left
  const remainingData = gzipStream.read()
  if (remainingData !== null) {
    destStream.write(remainingData)
  }
  gzipStream.end()
})

gzipStream.on('data', data => {
  const canContinue = destStream.write(data)
  if (!canContinue) {
    gzipStream.pause()
    destStream.once('drain', () => {
      gzipStream.resume()
    })
  }
})

gzipStream.on('end', () => {
  destStream.end()
})

// TODO: handle errors! >:)


Slide 92

03. PIPE()

Slide 93

readable.pipe(writableDest)

Connects a readable stream to a writable stream
A transform stream can be used as a destination as well
It returns the destination stream, allowing for a chain of pipes:

readable
  .pipe(transform1)
  .pipe(transform2)
  .pipe(transform3)
  .pipe(writable)

Slide 94

// stream-copy-gzip-pipe.js
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const gzipStream = createGzip()
const destStream = createWriteStream(dest)

srcStream
  .pipe(gzipStream)
  .pipe(destStream)


Slide 97

Set up complex pipelines with pipe
This is the most common way to use streams:

readable
  .pipe(decompress)
  .pipe(decrypt)
  .pipe(convert)
  .pipe(encrypt)
  .pipe(compress)
  .pipe(writeToDisk)


Slide 99

Handling errors (correctly)

readable
  .on('error', handleErr)
  .pipe(decompress)
  .on('error', handleErr)
  .pipe(decrypt)
  .on('error', handleErr)
  .pipe(convert)
  .on('error', handleErr)
  .pipe(encrypt)
  .on('error', handleErr)
  .pipe(compress)
  .on('error', handleErr)
  .pipe(writeToDisk)
  .on('error', handleErr)

handleErr should end and destroy the streams (it doesn't happen automatically)


Slide 101

04. STREAM UTILITIES


Slide 103

Slide 103 text

// stream-copy-gzip-pipeline.js const { pipeline } = require('stream') const { createReadStream, createWriteStream } = require('fs') const { createGzip } = require('zlib') const [, , src, dest] = process.argv pipeline( createReadStream(src), createGzip(), createWriteStream(dest), function onEnd (err) { if (err) { console.error(`Error: ${err}`) process.exit(1) } console.log('Done!') } ) stream.pipeline(...streams, callback) @loige Can pass multiple streams (they will be piped) 80

Slide 104

Slide 104 text

// stream-copy-gzip-pipeline.js const { pipeline } = require('stream') const { createReadStream, createWriteStream } = require('fs') const { createGzip } = require('zlib') const [, , src, dest] = process.argv pipeline( createReadStream(src), createGzip(), createWriteStream(dest), function onEnd (err) { if (err) { console.error(`Error: ${err}`) process.exit(1) } console.log('Done!') } ) stream.pipeline(...streams, callback) @loige Can pass multiple streams (they will be piped) The last argument is a callback. If invoked with an error, it means the pipeline failed at some point. All the streams are ended and destroyed correctly. 80

Slide 105

Slide 105 text

// stream-copy-gzip-pump.js const pump = require('pump') // from npm const { createReadStream, createWriteStream } = require('fs') const { createGzip } = require('zlib') const [, , src, dest] = process.argv pump( // just swap pipeline with pump! createReadStream(src), createGzip(), createWriteStream(dest), function onEnd (err) { if (err) { console.error(`Error: ${err}`) process.exit(1) } console.log('Done!') } ) stream.pipeline is available in Node.js 10+ In older systems you can use pump - npm.im/pump @loige 81

Slide 107

Slide 107 text

pumpify(...streams) - Create reusable pieces of pipeline npm.im/pumpify @loige Let's create EncGz, an application that helps us read and write encrypted-gzipped files 82

Slide 108

Slide 108 text

// encgz-stream.js - utility library const { createCipheriv, createDecipheriv, randomBytes, createHash } = require('crypto') const { createGzip, createGunzip } = require('zlib') const pumpify = require('pumpify') // from npm // calculates md5 of the secret (trimmed) function getCipherKey (secret) {} function createEncgz (secret) { const initVect = randomBytes(16) const cipherKey = getCipherKey(secret) const encryptStream = createCipheriv('aes256', cipherKey, initVect) const gzipStream = createGzip() const stream = pumpify(encryptStream, gzipStream) stream.initVect = initVect return stream } @loige 83

Slide 112

Slide 112 text

// encgz-stream.js (...continue from previous slide) function createDecgz (secret, initVect) { const cipherKey = getCipherKey(secret) const decryptStream = createDecipheriv('aes256', cipherKey, initVect) const gunzipStream = createGunzip() const stream = pumpify(gunzipStream, decryptStream) return stream } module.exports = { createEncgz, createDecgz } @loige 84

Slide 117

Slide 117 text

// encgz.js - CLI to encrypt and gzip (from stdin to stdout) const { pipeline } = require('stream') const { createEncgz } = require('./encgz-stream') const [, , secret] = process.argv const encgz = createEncgz(secret) console.error(`init vector: ${encgz.initVect.toString('hex')}`) pipeline( process.stdin, encgz, process.stdout, function onEnd (err) { if (err) { console.error(`Error: ${err}`) process.exit(1) } } ) @loige 85

Slide 121

Slide 121 text

// decgz.js - CLI to gunzip and decrypt (from stdin to stdout) const { pipeline } = require('stream') const { createDecgz } = require('./encgz-stream') const [, , secret, initVect] = process.argv const decgz = createDecgz(secret, Buffer.from(initVect, 'hex')) pipeline( process.stdin, decgz, process.stdout, function onEnd (err) { if (err) { console.error(`Error: ${err}`) process.exit(1) } } ) @loige 86

Slide 125

Slide 125 text

@loige 87

Slide 126

Slide 126 text

stream.finished(stream, callback) Get notified when a stream is no longer readable or writable - Node.js 10+ // finished can be promisified! const finished = util.promisify(stream.finished) const rs = fs.createReadStream('archive.tar') async function run() { await finished(rs) console.log('Stream is done reading.') } run().catch(console.error) rs.resume() // start & drain the stream @loige 88

Slide 129

Slide 129 text

readable-stream - npm package that contains the latest version of the Node.js stream library. It also makes Node.js streams compatible with the browser (can be used with Webpack and Browserify) npm.im/readable-stream @loige * yeah, the name is misleading. The package offers all the functionality of the official 'stream' package, not just readable streams. * 89

Slide 130

Slide 130 text

05. WRITING CUSTOM 05. WRITING CUSTOM STREAMS STREAMS @loige 90

Slide 131

Slide 131 text

// emoji-stream.js (custom readable stream) const { EMOJI_MAP } = require('emoji') // from npm const { Readable } = require('readable-stream') // from npm const emojis = Object.keys(EMOJI_MAP) function getEmojiDescription (index) { return EMOJI_MAP[emojis[index]][1] } function getMessage (index) { return emojis[index] + ' ' + getEmojiDescription(index) } class EmojiStream extends Readable { constructor (options) { super(options) this._index = 0 } _read () { if (this._index >= emojis.length) { return this.push(null) } return this.push(getMessage(this._index++)) } } module.exports = EmojiStream @loige 91

Slide 137

Slide 137 text

// uppercasify.js (custom transform stream) const { Transform } = require('readable-stream') class Uppercasify extends Transform { _transform (chunk, encoding, done) { this.push(chunk.toString().toUpperCase()) done() } _flush (done) { // in case there's buffered data that still has to be pushed done() } } module.exports = Uppercasify @loige 92

Slide 144

Slide 144 text

// dom-append.js (custom writable stream) const { Writable } = require('readable-stream') class DOMAppend extends Writable { constructor (target, tag = 'p', options) { super(options) this._target = target this._tag = tag } _write (chunk, encoding, done) { const elem = document.createElement(this._tag) const content = document.createTextNode(chunk.toString()) elem.appendChild(content) this._target.appendChild(elem) done() } } module.exports = DOMAppend @loige 93

Slide 151

Slide 151 text

06. STREAMS IN THE 06. STREAMS IN THE BROWSER BROWSER @loige 94

Slide 152

Slide 152 text

// browser/app.js const EmojiStream = require('../emoji-stream') const Uppercasify = require('../uppercasify') const DOMAppend = require('../dom-append') const list = document.getElementById('list') const emoji = new EmojiStream() const uppercasify = new Uppercasify() const append = new DOMAppend(list, 'li') emoji .pipe(uppercasify) .pipe(append) @loige 95

Slide 157

Slide 157 text

npm i --save-dev webpack webpack-cli node_modules/.bin/webpack src/browser/app.js # creates dist/main.js mv dist/main.js src/browser/app-bundle.js @loige Let's use webpack to build this app for the browser 96

Slide 158

Slide 158 text

Streams in the browser!
    @loige Finally let's create an index.html 97
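A minimal index.html for this app could look like the following (the exact markup is an assumption; app-bundle.js is the file produced by the webpack step, and the list id matches the one browser/app.js looks up):

```html
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>Streams in the browser!</title>
  </head>
  <body>
    <h1>Streams in the browser!</h1>
    <ul id="list"></ul>
    <script src="app-bundle.js"></script>
  </body>
</html>
```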

        Slide 161

        Slide 161 text

        @loige 98

        Slide 163

        Slide 163 text

07. CLOSING 07. CLOSING @loige 99

        Slide 164

        Slide 164 text

Streams have a low memory footprint Process data as soon as it's available Composition through pipelines Define your logic as transform streams Readable and writable streams are good abstractions for input & output You can swap them as needed without having to change the logic @loige TLDR; TLDR; 100

        Slide 165

        Slide 165 text

        IF YOU WANT TO LEARN (EVEN) MOAR IF YOU WANT TO LEARN (EVEN) MOAR ABOUT STREAMS... ABOUT STREAMS... nodejs.org/api/stream.html github.com/substack/stream-handbook @loige 101

        Slide 166

        Slide 166 text

        @loige 102

        Slide 167

        Slide 167 text

        CREDITS CREDITS Cover Photo by on for the amazing St. Patrick emoji art The internet for the memes! :D WeRoad Unsplash emojiart.org @loige SPECIAL THANKS SPECIAL THANKS , , , , @mariocasciaro @machine_person @Podgeypoos79 @katavic_d @UrsoLuca THANKS! THANKS! loige.link/streams-dub 103