3 Event-Driven Architecture Patterns in Action @ Red Hat Developers DevNation Tech Talk - 03/23

Abstract:

The shift from monolithic applications to microservice architectures is anything but easy. Since services usually don’t operate in isolation, it’s vital to implement proper communication models. In order to avoid tight coupling and numerous point-to-point connections between any two services, an effective approach is to introduce an event streaming platform in between. Doing so allows services to publish and subscribe to events without directly communicating with one another. During this tech talk, we explore three event-driven architecture patterns commonly found in the field: the claim-check pattern, the content enricher pattern, and the message translator pattern. For each of these patterns, we'll run a concrete sample scenario built on top of Apache Kafka and discuss some variations and trade-offs along the way.

Recording: https://www.youtube.com/watch?v=6fBP1Ncjva8

Repository: https://github.com/hpgrahsl/eda-pattern-examples

Hans-Peter Grahsl

March 30, 2023

Transcript

  1. AGENDA: What we’ll be discussing today
     • EDA and Events
     • Claim Check Pattern
     • Content Enricher Pattern
     • Message Translator Pattern
  2. OpenShift Developer Sandbox! Get free access, renewable every 30 days, to a self-service, cloud-hosted Kubernetes experience with the Developer Sandbox for Red Hat OpenShift. red.ht/sandbox4all
  3. EDA Intro: 🤔 Event-Driven Architecture (EDA)?
     Event-Driven Architecture (EDA) is a way of designing loosely coupled applications and services that respond in near real-time based on the sending and receiving of events.
  4. Events and Event Design
     Events … or “something has happened (somewhere)”
     • order created (order service)
     • order created with id ‘123’ (order service)
     • order created with id ‘123’, and N products (order service)
     • order created with id ‘123’, N products, and total amount of M (order service)
     • …
     • order created with id ‘123’, product A, product B, …, total amount of M, customer with id ‘XYZ’, and address ‘foo blah’, … (order service)
  5. Events and Event Design
     🤔 Where does this end? 🤔
     🛑 When should this stop? 🛑
  6. Events and Event Design
     One seeks bullets of silver … but only finds trade-offs.
  7. Challenge: Claim Check Pattern
     (unnaturally / too) heavy payloads?
     • raw / binary files: graphics, documents, audio, video, …
     • embedded data (base64) within structured text formats (XML, JSON, …)
  8. Claim Check Pattern
     • offload large (binary) payload into a separate storage system
     • replace the actual payload with a reference, the claim check
     • use the claim check to fetch the data when needed on the consumer side (see the sketch below)
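
A minimal sketch of that consumer-side “check-out” step, assuming MinIO as the object store (as in the demo scenario that follows) and a hypothetical bucket/key layout; the consumed event carries only the object key, which is redeemed to fetch the actual payload:

```java
import io.minio.GetObjectArgs;
import io.minio.MinioClient;

public class ClaimCheckResolver {

    private final MinioClient storage;
    private final String bucket;

    public ClaimCheckResolver(MinioClient storage, String bucket) {
        this.storage = storage;
        this.bucket = bucket;
    }

    // Redeem the claim check: the consumed event only contains an object key,
    // the actual (large) payload is fetched from object storage on demand.
    public byte[] fetchPayload(String claimCheckKey) {
        try (var stream = storage.getObject(
                GetObjectArgs.builder().bucket(bucket).object(claimCheckKey).build())) {
            return stream.readAllBytes();
        } catch (Exception e) {
            throw new RuntimeException("could not resolve claim check " + claimCheckKey, e);
        }
    }
}
```
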
  9. Demo Scenario: Claim Check Pattern
     • context: media processing pipeline
     • images sent between services
     • communication via Kafka topics
     • external storage system: MinIO
  10. Implementation Details: Claim Check Pattern
     • Kafka producer/consumer apps written with Quarkus
     • offloading of binary data delegated to a specific Serde (see the sketch below)
     • MinIO (S3-compatible) acts as the storage system
     → reasonably-sized payloads in order not to pollute the event streaming platform with tons of binary data
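
The demo delegates the offloading to a dedicated Serde; the following is a simplified, producer-side sketch of that idea, assuming the MinIO Java client and UUID-based object keys (both assumptions for illustration, not necessarily what the repository does):

```java
import io.minio.MinioClient;
import io.minio.PutObjectArgs;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.UUID;
import org.apache.kafka.common.serialization.Serializer;

// Producer-side half of a claim-check Serde: upload the heavy payload to
// object storage and put only a reference (the object key) onto the topic.
public class ClaimCheckSerializer implements Serializer<byte[]> {

    private final MinioClient storage;
    private final String bucket;

    public ClaimCheckSerializer(MinioClient storage, String bucket) {
        this.storage = storage;
        this.bucket = bucket;
    }

    @Override
    public byte[] serialize(String topic, byte[] payload) {
        String objectKey = UUID.randomUUID().toString();
        try {
            storage.putObject(PutObjectArgs.builder()
                    .bucket(bucket)
                    .object(objectKey)
                    .stream(new ByteArrayInputStream(payload), payload.length, -1)
                    .build());
        } catch (Exception e) {
            throw new RuntimeException("offloading payload to object storage failed", e);
        }
        // the record value written to Kafka is just the claim check reference
        return objectKey.getBytes(StandardCharsets.UTF_8);
    }
}
```
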
  11. Discussion: Claim Check Pattern
     • Serde for configurable data offloading with ease
     • in a Kafka Connect context → apply a Converter
     • think about object storage id generation choices
     • currently full payload only → no field-level scope
     • to some degree it’s a dual-writes problem
  12. Challenge: Content Enricher Pattern
     (unnaturally / too) light payloads?
     • payloads lacking additional details
     • certain consumers need to know more
     • e.g. due to very compact notification events
  13. Content Enricher Pattern
     • add data to the original payload
     • perform lookups or do joins against reference data
     • take reference data to augment the original payload with additional information
  14. Demo Scenario: Content Enricher Pattern
     • context: IoT sensor data processing
     • raw stream contains sensor measurements only
     • certain consumers need device meta-data
     • sensor stream is enriched with the missing data
  15. Implementation Details: Content Enricher Pattern
     • device data resides in a MySQL table
     • Kafka data ingestion by means of CDC → Debezium
     • Quarkus producer app simulates random sensor data
     • KStreams app joins sensor data with device data (see the sketch below)
     • enriched data continuously written to a separate topic
     → consumers can access and directly work with the enriched data
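
A condensed sketch of that enrichment step, assuming device meta-data is available as a GlobalKTable keyed by device id and that both streams carry plain JSON strings; topic names and serdes are illustrative and may differ from the demo repository:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.GlobalKTable;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

public class SensorEnrichmentTopology {

    public static StreamsBuilder buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();

        // raw sensor measurements, keyed by device id (topic names are assumptions)
        KStream<String, String> measurements =
                builder.stream("sensor-measurements", Consumed.with(Serdes.String(), Serdes.String()));

        // device meta-data ingested via Debezium CDC, used as a global lookup table
        GlobalKTable<String, String> devices =
                builder.globalTable("device-metadata", Consumed.with(Serdes.String(), Serdes.String()));

        // content enricher: join each measurement with the meta-data of its device
        measurements
                .join(devices,
                        (deviceId, measurement) -> deviceId,   // lookup key into the device table
                        (measurement, device) ->               // augment the original payload
                                "{\"measurement\":" + measurement + ",\"device\":" + device + "}")
                .to("sensor-measurements-enriched", Produced.with(Serdes.String(), Serdes.String()));

        return builder;
    }
}
```
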
  16. Discussion: Content Enricher Pattern
     • augmentation possible with different tech stacks
     • stream processing lib/framework recommended
     • alternatively, SQL abstractions might be beneficial
     • enrichment kept separate from the consumer
     • reference data local in / close to the processing app
  17. Challenge: Message Translator Pattern
     inconvenient / incompatible payloads?
     • minimize or at least reduce the contract-related coupling
     • examples:
       ◦ integrate legacy systems
       ◦ work with proprietary data formats
       ◦ prevent leakage of domain models
  18. Message Translator Pattern
     • perform a wide range of payload modifications
       ◦ basic field type conversions
       ◦ rename, redact, or mask fields
       ◦ complete format/protocol change
       ◦ …
     • entity modifications between bounded contexts
     • ideally “self-contained” operations only
  19. Demo Scenario: Message Translator Pattern
     • context: point-of-sale record processing
     • files (source) with data in a proprietary CSV format
     • web service (sink) expecting a specific JSON format
     • streaming data pipeline with in-flight transformations
  20. Implementation Details: Message Translator Pattern
     • streaming data pipeline with Kafka Connect
       ◦ file source connector processing CSV files
       ◦ HTTP sink connector feeding JSON into a web API
     • message translation with a Kafka Connect SMT chain (see the configuration sketch below)
       ◦ drop and rename fields
       ◦ field type conversions
     → enabler for data integration between disparate systems
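
An illustrative sink-connector configuration showing how such an SMT chain can be expressed with the built-in ReplaceField and Cast transformations; the connector class, topic, and field names below are placeholders rather than the repository’s actual settings:

```properties
name=pos-records-http-sink
# placeholder: the concrete HTTP sink connector class depends on the plugin used
connector.class=<http-sink-connector-class>
topics=pos-records
tasks.max=1

# message translation happens in-flight via a chain of Single Message Transforms
transforms=dropAndRename,castTypes

# drop internal fields and rename the remaining ones to match the target contract
transforms.dropAndRename.type=org.apache.kafka.connect.transforms.ReplaceField$Value
transforms.dropAndRename.exclude=internal_flag
transforms.dropAndRename.renames=cust_no:customerNumber,amnt:amount

# basic field type conversions expected by the receiving web API
transforms.castTypes.type=org.apache.kafka.connect.transforms.Cast$Value
transforms.castTypes.spec=amount:float64,quantity:int32
```
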
  21. Discussion: Message Translator Pattern
     • based on Kafka Connect → config only
     • supports many different data sources / sinks
     • various on-the-fly transformations → SMT chains
     • translations on either / both sides of the data flow
     • no direct support for stateful processing
  22. Summary of Observations in EDA
     • looser coupling, better extensibility, higher independence
     • good event design needs collaboration + domain expertise
     • no silver bullets for event design → trade-offs
     • patterns help to deal with the implications / to cope with the trade-offs
  23. Thank you!
     Red Hat is the world’s leading provider of enterprise open source software solutions. Award-winning support, training, and consulting services make Red Hat a trusted adviser to the Fortune 500.
     linkedin.com/company/red-hat | youtube.com/user/RedHatVideos | facebook.com/redhatinc | twitter.com/RedHat
  24. Photo Credits (in order of appearance)
     https://unsplash.com/photos/DRzYMtae-vA
     https://pixabay.com/illustrations/night-sky-stars-moon-wolf-garou-4726445/
     https://unsplash.com/photos/HTIduwcMMfQ
     https://unsplash.com/photos/w9rYOScXf6E
     https://unsplash.com/photos/iar-afB0QQw
     https://unsplash.com/photos/q65bNe9fW-w
     https://pixabay.com/photos/video-production-video-movie-film-4223885/
     https://pixabay.com/photos/feather-bird-plumage-flying-4380861/
     https://unsplash.com/photos/MceA9kSze0U
     https://pixabay.com/photos/travel-adapter-plug-charging-plug-3320763/
     https://unsplash.com/photos/g97bviyFhvQ