Mariano Anaya
January 17, 2018

Exploring Generators & Coroutines

Transcript

2. $ history
• ~2001: PEP-255 - “Simple Generators” [Py 2.2]
• ~2005: PEP-342 - “Coroutines via Enhanced Generators” [Py 2.5]
• ~2009: PEP-380 - “Syntax for Delegating to a Subgenerator” [Py 3.3]
• ~2016: PEP-525 - “Asynchronous Generators” [Py 3.6]
3. Simple Generators: Basic Idea
Generate elements, one at a time → lazy computation
• Save memory
• Support the iteration pattern, infinite sequences, etc.
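The lazy-computation point can be sketched with an infinite generator; `itertools.islice` then consumes only as many elements as requested (`naturals` is a name chosen here for illustration):

```python
import itertools

def naturals():
    # Infinite generator: each value is produced lazily, one at a time,
    # so the full sequence is never held in memory.
    n = 0
    while True:
        yield n
        n += 1

# Take the first five elements without ever materializing the sequence.
print(list(itertools.islice(naturals(), 5)))  # [0, 1, 2, 3, 4]
```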
4. Simple Generators
• The yield statement makes the function a generator.
• next() advances execution until the next yield statement is reached.
• yield produces a value to the caller and suspends execution.
5. Simple Generators: Example 1
“Sum all numbers up to N.”

LIMIT = 1_000_000

def old_range(n):
    numbers = []
    i = 0
    while i < n:
        numbers.append(i)
        i += 1
    return numbers
6. Simple Generators: Example 2

def new_range(n):
    i = 0
    while i < n:
        yield i
        i += 1
7. Simple Generators: iteration
• Given a generator g = generator_function():
  ◦ next(g) will advance to the next yield statement.
  ◦ If there are no more elements, StopIteration is raised.
• The idiom “for x in g: ...” follows this protocol:
  ◦ Calls iter(g) → __iter__, __next__
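The protocol the for loop follows can be spelled out by hand; this is a sketch of what “for x in g: ...” does under the hood:

```python
def gen():
    yield 1
    yield 2

g = gen()
it = iter(g)      # calls g.__iter__(); a generator is its own iterator
assert it is g

collected = []
while True:
    try:
        collected.append(next(it))  # calls it.__next__()
    except StopIteration:
        break                       # the for loop swallows StopIteration

print(collected)  # [1, 2]
```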
8. Iteration

>>> g = generator()
>>> next(g)
1
>>> next(g)
2
>>> next(g)
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
StopIteration
>>>
9. next() & StopIteration

>>> g = gen()
>>> next(g)
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
StopIteration
>>> next(g, "default value")
'default value'
10. Use Generators

Pythonic way:

def new_range(n):
    i = 0
    while i < n:
        yield i
        i += 1

total = sum(new_range(LIMIT))

Alternative:

total = 0
i = 0
while i < LIMIT:
    total += i
    i += 1

A generator gives a proper abstraction, with all the advantages of iterators for free (combine with itertools, chain, assign, pass along, etc.).

12. Coroutines via Enhanced Generators
• How about sending (receiving) data to (from) a generator?
• And exceptions?

<g>.send(<value>)
<g>.throw(<exception>)
<g>.close()
13. Coroutines via Enhanced Generators
• Coroutines are syntactically like generators.
• With .send(), the caller pushes data into the coroutine.
  ◦ yield usually appears on the RHS: value = yield result
• The coroutine is suspended at the yield.
14. Coroutines via Enhanced Generators

def coro():
    step = 0
    while True:
        received = yield step
        step += 1
        print("Received: ", received)

>>> c = coro()
>>> next(c)
>>> step = c.send(received)
15. Advance the Generator
Before sending any value to the generator, it has to be advanced with:

next(coroutine) | coroutine.send(None)

If not, TypeError is raised.
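A minimal sketch of the priming rule above; the coroutine body here is just for illustration:

```python
def coro():
    while True:
        received = yield
        print("got:", received)

c = coro()
try:
    c.send("hello")   # sending before advancing...
except TypeError as e:
    print(e)          # ...raises TypeError

c = coro()
next(c)               # prime it: now suspended at the yield
c.send("hello")       # prints: got: hello
```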
16. Coroutines via Enhanced Generators

generator.throw(exc_type[, exc_value[, exc_tb]])

• Raises the exception at the point where the coroutine is suspended.
• Equivalent to: raise exc_type, exc_value, exc_tb
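A sketch of .throw() in action: the exception appears at the yield where the coroutine is suspended, so it can be handled there (the handler shown is an assumption for illustration):

```python
def coro():
    while True:
        try:
            yield
        except ValueError as e:
            print("handled:", e)   # the exception surfaces at the yield

c = coro()
next(c)                       # advance to the first yield
c.throw(ValueError("boom"))   # raised inside the coroutine, handled there
```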

18. Delegating to a Sub-Generator
• Generators can now return values!
• yield from:
  ◦ Gets all values from an iterable object
  ◦ Produces the values from the sub-generator
  ◦ Opens a channel to the internal generator
  ◦ Can get the value returned by the internal generator
19. Generators as Coroutines: return values
StopIteration.value contains the result. → Once the return is reached, there is no more iteration.

>>> def gen():
...     yield 1
...     yield 2
...     return 42
...
>>> g = gen()
>>> next(g)
1
>>> next(g)
2
>>> next(g)
------------------------------------
StopIteration    Traceback (most recent call last)
StopIteration: 42
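The return value travels in StopIteration.value, and a delegating generator can capture it directly with yield from (`caller` is a name used here for illustration):

```python
def gen():
    yield 1
    yield 2
    return 42

def caller():
    result = yield from gen()   # result receives gen()'s return value
    yield f"inner returned {result}"

print(list(caller()))  # [1, 2, 'inner returned 42']
```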
20. yield from
• Get all the elements from an iterable
• Delegate to a sub-generator
21. yield from: extract values
Basic usage: yield from x

Could be thought of as:

for e in x:
    yield e
22. yield from: Basics
Similar to itertools.chain:

>>> def chain2(*iterables):
...     for it in iterables:
...         yield from it
>>> list(chain2([1, 2, 3], (4, 5, 6), "hello"))
[1, 2, 3, 4, 5, 6, 'h', 'e', 'l', 'l', 'o']
23. yield from: More
Communicate with the internal generators:
• .send() and .throw() are passed along.
• Returned (yielded) values bubble up.

yield from acts as a “channel” from the original caller to the internal generators.
24. Example: yield from

def internal(name, limit):
    for i in range(limit):
        value = yield i
        print(f"{name} got: {value}")

def general():
    yield from internal("first", 10)
    yield from internal("second", 20)
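The example can be driven from the outside; values passed to .send() travel through yield from straight to whichever internal generator is currently suspended. A self-contained sketch repeating the slide's definitions:

```python
def internal(name, limit):
    for i in range(limit):
        value = yield i
        print(f"{name} got: {value}")

def general():
    yield from internal("first", 10)
    yield from internal("second", 20)

g = general()
print(next(g))       # 0 — suspended inside internal("first", 10)
print(g.send("hi"))  # prints 'first got: hi', then yields 1
```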
25. yield from: recap
• Allows delegating to a sub-generator
• Enables chaining generators and many iterables together
• Makes it easier to refactor generators

27. asyncio
• In asyncio, the event loop drives the coroutines scheduled to run, updating them with .send(), next(), .throw(), etc.
• The coroutine we write should only delegate with await (yield from) to some other 3rd-party generator, which will do the actual I/O.
• yield, yield from, and await give control back to the scheduler.
28. yield from & await

# py 3.4
@asyncio.coroutine
def coroutine():
    yield from asyncio.sleep(1)

# py 3.5+
async def coroutine():
    await asyncio.sleep(1)
29. await
Works like yield from, except that it:
• Does not accept generators that aren’t coroutines
• Accepts awaitable objects: those implementing __await__()
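A minimal sketch of a custom awaitable (`Immediate` is a hypothetical class name): __await__ must return an iterator, and the value carried by its StopIteration becomes the result of the await expression.

```python
import asyncio

class Immediate:
    """Hypothetical awaitable: completes immediately with a fixed value."""
    def __init__(self, value):
        self.value = value

    def __await__(self):
        # A generator function: returning here raises StopIteration(value),
        # which 'await' unpacks as the result.
        return self.value
        yield  # unreachable, but makes __await__ a generator function

async def main():
    return await Immediate(42)

print(asyncio.run(main()))  # 42
```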
30. yield from vs. await

             Coroutine   Generator   Awaitable
yield from   Yes         Yes         No
await        Yes         No          Yes
32. Asynchronous Generators
• Before Python 3.6, it was not possible to have a yield in a coroutine :-(
  ◦ “async def” only allowed “return” or “await”
• “Produce elements, one at a time, asynchronously”:
  ◦ async for x in data_producer: ...
  ◦ Asynchronous iterables were required
• Analogous to the difference between an iterator (__iter__ / __next__) and a generator,
  ◦ but for asynchronous code: async iterator (__aiter__ / __anext__) vs. async generator
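A sketch of an asynchronous generator and async for, per the PEP-525 behavior described above (`ticker` is a name chosen here for illustration):

```python
import asyncio

async def ticker(n):
    # Since Python 3.6, 'yield' is allowed inside 'async def':
    # this defines an asynchronous generator (__aiter__ / __anext__).
    for i in range(n):
        await asyncio.sleep(0)   # stand-in for real asynchronous work
        yield i

async def main():
    # 'async for' drives __anext__, awaiting each step.
    return [x async for x in ticker(3)]

print(asyncio.run(main()))  # [0, 1, 2]
```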
33. Summary
• Thinking in terms of iterables allows us to build better patterns.
• Generators and coroutines are conceptually different, but their implementation details are similar.
• yield from is a construction that allows more powerful coroutines.
• Any yield from chain of calls ends with a yield (at the end, there is a generator).