Big data processing with Apache Beam

In this talk, we present the new Python SDK for Apache Beam - a parallel programming model that allows one to implement batch and streaming data processing jobs that can run on a variety of execution engines like Apache Spark and Google Cloud Dataflow. We will use examples to discuss some of the interesting challenges in providing a Pythonic API and execution environment for distributed processing.


Sourabh

July 06, 2017

Transcript

  1. 5.

    Hello! I am Sourabh. I am a Software Engineer.
    I tweet at @sb2nov. I like ice cream.
  2. 22.

    ASKING THE RIGHT QUESTIONS
    What is being computed? Where in event time?
    When in processing time? How do refinements happen?
  3. 25.

    WHERE IN EVENT TIME?
    scores: PCollection[KV[str, int]] = (
        input
        | beam.WindowInto(FixedWindows(2 * 60))
        | Sum.integersPerKey())
  4. 28.

    WHEN IN PROCESSING TIME?
    scores: PCollection[KV[str, int]] = (
        input
        | beam.WindowInto(FixedWindows(2 * 60)
            .triggering(AtWatermark()))
        | Sum.integersPerKey())
  5. 30.

    HOW DO REFINEMENTS HAPPEN?
    scores: PCollection[KV[str, int]] = (
        input
        | beam.WindowInto(FixedWindows(2 * 60)
            .triggering(AtWatermark()
                .withEarlyFirings(AtPeriod(1 * 60))
                .withLateFirings(AtCount(1)))
            .accumulatingFiredPanes())
        | Sum.integersPerKey())
  6. 32.

    CUSTOMIZING WHAT / WHERE / WHEN / HOW
    Classic Batch, Windowed Batch, Streaming, Streaming + Accumulation.
    For more information see https://cloud.google.com/dataflow/examples/gaming-example
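FixedWindows(2 * 60) answers "where in event time?" by bucketing each element into a two-minute window keyed by its event-time timestamp. A minimal plain-Python sketch of that window math (no Beam dependency; `fixed_window` is a hypothetical helper for illustration, not a Beam API):

```python
# Sketch of fixed-window assignment by event time. This mirrors the idea
# behind Beam's FixedWindows; it is not the Beam implementation.

def fixed_window(event_ts, size=2 * 60):
    """Return the [start, end) window containing an event-time timestamp."""
    start = event_ts - (event_ts % size)
    return (start, start + size)

# Elements with timestamps 0..119 share one window; 120 starts the next.
print(fixed_window(65))    # -> (0, 120)
print(fixed_window(125))   # -> (120, 240)
```

Triggers then control *when* in processing time each window's result is emitted, and accumulation mode controls how successive firings for the same window refine one another.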
  8. 36.

    WORD COUNT
    import apache_beam as beam, re
    with beam.Pipeline() as p:
        (p
         | beam.io.textio.ReadFromText("input.txt"))
  9. 37.

    WORD COUNT
    import apache_beam as beam, re
    with beam.Pipeline() as p:
        (p
         | beam.io.textio.ReadFromText("input.txt")
         | beam.FlatMap(lambda s: re.split("\\W+", s)))
  10. 38.

    WORD COUNT
    import apache_beam as beam, re
    with beam.Pipeline() as p:
        (p
         | beam.io.textio.ReadFromText("input.txt")
         | beam.FlatMap(lambda s: re.split("\\W+", s))
         | beam.combiners.Count.PerElement())
  11. 39.

    WORD COUNT
    import apache_beam as beam, re
    with beam.Pipeline() as p:
        (p
         | beam.io.textio.ReadFromText("input.txt")
         | beam.FlatMap(lambda s: re.split("\\W+", s))
         | beam.combiners.Count.PerElement()
         | beam.Map(lambda wc: "%s: %d" % wc))
  12. 40.

    WORD COUNT
    import apache_beam as beam, re
    with beam.Pipeline() as p:
        (p
         | beam.io.textio.ReadFromText("input.txt")
         | beam.FlatMap(lambda s: re.split("\\W+", s))
         | beam.combiners.Count.PerElement()
         | beam.Map(lambda wc: "%s: %d" % wc)
         | beam.io.textio.WriteToText("output/stringcounts"))
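The same chain of transforms — split, count per element, format — can be mirrored in plain Python to see what each step produces. A sketch without Beam (Beam applies the same element-wise logic across a distributed PCollection):

```python
import re
from collections import Counter

lines = ["the cat sat", "the cat"]

# FlatMap(lambda s: re.split("\\W+", s)): each line becomes many words.
words = [w for line in lines for w in re.split(r"\W+", line) if w]

# combiners.Count.PerElement(): count occurrences of each distinct word.
counts = Counter(words)

# Map(lambda wc: "%s: %d" % wc): format each (word, count) pair.
formatted = sorted("%s: %d" % (w, c) for w, c in counts.items())
print(formatted)  # -> ['cat: 2', 'sat: 1', 'the: 2']
```

The `if w` filter drops the empty strings `re.split` can produce at line boundaries, a detail the slide version glosses over.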
  13. 44.

    TRENDING ON TWITTER
    with beam.Pipeline() as p:
        (p
         | beam.io.ReadStringsFromPubSub("twitter_topic")
         | beam.WindowInto(SlidingWindows(5 * 60, 1 * 60))
         | beam.ParDo(ParseHashTagDoFn()))
  14. 45.

    TRENDING ON TWITTER
    with beam.Pipeline() as p:
        (p
         | beam.io.ReadStringsFromPubSub("twitter_topic")
         | beam.WindowInto(SlidingWindows(5 * 60, 1 * 60))
         | beam.ParDo(ParseHashTagDoFn())
         | beam.combiners.Count.PerElement())
  15. 46.

    TRENDING ON TWITTER
    with beam.Pipeline() as p:
        (p
         | beam.io.ReadStringsFromPubSub("twitter_topic")
         | beam.WindowInto(SlidingWindows(5 * 60, 1 * 60))
         | beam.ParDo(ParseHashTagDoFn())
         | beam.combiners.Count.PerElement()
         | beam.ParDo(BigQueryOutputFormatDoFn())
         | beam.io.WriteToBigQuery("trends_table"))
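ParseHashTagDoFn is a user-defined DoFn from the talk; its internals are not shown. A plausible plain-Python sketch of its core logic, plus the overlapping-window membership implied by SlidingWindows(5*60, 1*60) (both helpers are hypothetical illustrations, not Beam APIs):

```python
import re

def parse_hashtags(tweet):
    """Core of a ParseHashTagDoFn (sketch): emit each hashtag in a tweet."""
    return re.findall(r"#\w+", tweet)

def sliding_windows(event_ts, size=5 * 60, period=1 * 60):
    """All [start, start + size) windows containing the timestamp (sketch).

    With a 5-minute size and a 1-minute period, each element lands in 5
    overlapping windows, so every minute a count over the trailing
    5 minutes becomes available.
    """
    last_start = event_ts - (event_ts % period)
    first_start = last_start - size + period
    return [(s, s + size)
            for s in range(first_start, last_start + 1, period)]

print(parse_hashtags("loving #beam at #europython"))  # -> ['#beam', '#europython']
print(len(sliding_windows(600)))                      # -> 5
```

Counting per element inside those sliding windows is what turns raw hashtags into a continuously refreshed "trending" ranking.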
  16. 48.

    PIPELINE SDK (Beam Java, Beam Python, other languages)
    User-facing SDK; defines a language-specific API for the end user to specify the pipeline computation DAG.
  17. 49.

    RUNNER API
    Runner- and language-agnostic representation of the user's pipeline graph. It contains only nodes of Beam model primitives that all runners understand, to maintain portability across runners.
  18. 50.

    SDK HARNESS
    Docker-based execution environments, shared by all runners, for running the user code in a consistent environment.
  19. 51.

    FN API
    API that the execution environments use to send and receive data and to report metrics about execution of the user code to the Runner.
  20. 52.

    RUNNER (Apache Flink, Apache Spark, Cloud Dataflow, Apache Gearpump, Apache Apex)
    Distributed processing environments that understand the Runner API graph and know how to execute the Beam model primitives.
  21. 53.

    More Beam?
    Issue tracker: https://issues.apache.org/jira/projects/BEAM
    Beam website: https://beam.apache.org/
    Source code: https://github.com/apache/beam
    Developers mailing list: dev-subscribe@beam.apache.org
    Users mailing list: user-subscribe@beam.apache.org