Building an Interactive Stream Application with Quarkus

Presentation slides from JJUG CCC Fall 2025.
The deck explains how to combine Quarkus with Kafka Streams and introduces a real-world example.

Tomohiro Hashidate

November 15, 2025

Transcript

  1. Some data and services are hard to handle with an RDB or KVS alone:
     - Sketches (probabilistic data structures; see the example below)
     - CRDTs (Conflict-free Replicated Data Types)
     - In-memory-oriented data structures such as Apache Arrow
     Some databases ship their own extensions for these, but whether they actually fit the use case is decided case by case.
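     As a concrete illustration (not from the deck) of the "sketch" category, here is a minimal Count-Min Sketch in Java; the class name and sizing are made up for the example. It counts key occurrences in fixed memory and may overestimate on hash collisions, but never underestimates.

     import java.util.Random;

     // Minimal Count-Min Sketch: approximate per-key counters in fixed memory.
     public class CountMinSketch {
         private final int depth;          // number of hash functions
         private final int width;          // counters per hash function
         private final long[][] counters;
         private final int[] seeds;

         public CountMinSketch(int depth, int width) {
             this.depth = depth;
             this.width = width;
             this.counters = new long[depth][width];
             this.seeds = new Random(42).ints(depth).toArray();
         }

         private int bucket(String key, int row) {
             return Math.floorMod(key.hashCode() ^ seeds[row], width);
         }

         public void add(String key) {
             for (int row = 0; row < depth; row++) {
                 counters[row][bucket(key, row)]++;
             }
         }

         // Never underestimates; may overestimate when buckets collide.
         public long estimate(String key) {
             long min = Long.MAX_VALUE;
             for (int row = 0; row < depth; row++) {
                 min = Math.min(min, counters[row][bucket(key, row)]);
             }
             return min;
         }
     }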
  2. Sample code (from the official documentation)

     Properties props = new Properties();
     String rpcEndpoint = "host1:4460";
     // Register the RPC endpoint in the configuration
     props.put(StreamsConfig.APPLICATION_SERVER_CONFIG, rpcEndpoint);
     // ... further settings may follow here ...
     StreamsBuilder builder = new StreamsBuilder();
     KStream<String, String> textLines =
         builder.stream("word-count-input", Consumed.with(stringSerde, stringSerde));
     // ... omitted ...
     KafkaStreams streams = new KafkaStreams(builder.build(), props);
     streams.start();

     // Start the RPC service alongside Kafka Streams
     MyRPCService rpcService = ...;
     rpcService.listenAt(rpcEndpoint);
  3. Sample code for reading data

     KafkaStreams streams = ...;
     // Get the key-value store CountsKeyValueStore
     ReadOnlyKeyValueStore<String, Long> keyValueStore =
         streams.store("CountsKeyValueStore", QueryableStoreTypes.keyValueStore());

     // Get value by key
     System.out.println("count for hello: " + keyValueStore.get("hello"));

     // Get the values for a range of keys available in this application instance
     KeyValueIterator<String, Long> range = keyValueStore.range("all", "streams");
     while (range.hasNext()) {
         KeyValue<String, Long> next = range.next();
         System.out.println("count for " + next.key + ": " + next.value);
     }
  4. Sample code for querying a remote node

     KafkaStreams streams = ...;
     // Find all the locations of local instances of the state store named "word-count"
     Collection<StreamsMetadata> wordCountHosts = streams.allMetadataForStore("word-count");

     // For illustrative purposes, we assume using an HTTP client to talk to remote app instances.
     HttpClient http = ...;

     // Get the word count for word (aka key) 'alice': Approach 1
     //
     // We first find the one app instance that manages the count for 'alice' in its local state stores.
     StreamsMetadata metadata = streams.metadataForKey("word-count", "alice", Serdes.String().serializer());
     // Then, we query only that single app instance for the latest count of 'alice'.
     // Note: The RPC URL shown below is fictitious and only serves to illustrate the idea. Ultimately,
     // the URL (or, in general, the method of communication) will depend on the RPC layer you opted to
     // implement. Again, we provide end-to-end demo applications (such as KafkaMusicExample) that showcase
     // how to implement such an RPC layer.
     Long result = http.getLong("http://" + metadata.host() + ":" + metadata.port() + "/word-count/alice");
  5. // Get the word count for word (aka key) 'alice': Approach 2
     //
     // Alternatively, we could also choose (say) a brute-force approach where we query every app instance
     // until we find the one that happens to know about 'alice'.
     Optional<Long> result = streams.allMetadataForStore("word-count")
         .stream()
         .map(streamsMetadata -> {
             // Construct the (fictitious) full endpoint URL to query the current remote application instance
             String url = "http://" + streamsMetadata.host() + ":" + streamsMetadata.port() + "/word-count/alice";
             // Read and return the count for 'alice', if any.
             return http.getLong(url);
         })
         .filter(s -> s != null)
         .findFirst();
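     The http.getLong(...) helper used in both approaches above is deliberately left fictitious in the documentation. Purely as a sketch, here is one way it could look with the JDK's java.net.http.HttpClient, assuming each instance returns the count as a plain-text number; the class name SimpleHttpClient is hypothetical.

     import java.io.IOException;
     import java.net.URI;
     import java.net.http.HttpClient;
     import java.net.http.HttpRequest;
     import java.net.http.HttpResponse;

     public class SimpleHttpClient {
         private final HttpClient client = HttpClient.newHttpClient();

         // Returns the numeric response body, or null when the remote instance has no value.
         public Long getLong(String url) {
             HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
             try {
                 HttpResponse<String> response =
                     client.send(request, HttpResponse.BodyHandlers.ofString());
                 if (response.statusCode() != 200) {
                     return null;
                 }
                 return Long.parseLong(response.body().trim());
             } catch (IOException e) {
                 return null;
             } catch (InterruptedException e) {
                 Thread.currentThread().interrupt();
                 return null;
             }
         }
     }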
  6. The basic mechanism of the Kafka Streams extension

     The KafkaStreams object is created inside Quarkus's DI mechanism.

     @Singleton
     public class KafkaStreamsProducer {
         // ... omitted ...
         @Inject
         public KafkaStreamsProducer(KafkaStreamsSupport kafkaStreamsSupport,
                 KafkaStreamsRuntimeConfig runtimeConfig,
                 ExecutorService executorService,
                 Instance<Topology> topology,
                 Instance<KafkaClientSupplier> kafkaClientSupplier,
                 @Identifier("default-kafka-broker") Instance<Map<String, Object>> defaultConfiguration,
                 Instance<StateListener> stateListener,
                 Instance<StateRestoreListener> globalStateRestoreListener,
                 Instance<StreamsUncaughtExceptionHandler> uncaughtExceptionHandlerListener) {
             // ... omitted ...
             this.executorService = executorService;
             this.streamsConfig = new StreamsConfig(kafkaStreamsProperties);
             this.kafkaStreams = initializeKafkaStreams(streamsConfig, topology.get(), kafkaClientSupplier,
                     stateListener, globalStateRestoreListener, uncaughtExceptionHandlerListener);
             this.topologyManager = new KafkaStreamsTopologyManager(kafkaAdminClient, topology.get(), runtimeConfig);
         }
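     For context (not shown on the slide): the kafkaStreamsProperties passed to StreamsConfig come from the application configuration. A typical application.properties for this extension might look roughly like the sketch below; the values are illustrative.

     # application.properties (illustrative values)
     quarkus.kafka-streams.application-id=word-count-app
     quarkus.kafka-streams.bootstrap-servers=localhost:9092
     quarkus.kafka-streams.topics=Texts
     # Additional Kafka Streams settings can be passed through with the kafka-streams. prefix
     kafka-streams.commit.interval.ms=1000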
  7. The application is started from the onStartup hook

     public void onStartup(@Observes StartupEvent event, Event<KafkaStreams> kafkaStreamsEvent) {
         if (kafkaStreams != null) {
             kafkaStreamsEvent.fire(kafkaStreams);
             executorService.execute(() -> {
                 try {
                     topologyManager.waitForTopicsToBeCreated();
                 } catch (InterruptedException e) {
                     Thread.currentThread().interrupt();
                     return;
                 }
                 if (!topologyManager.isClosed()) {
                     LOGGER.debug("Starting Kafka Streams pipeline");
                     kafkaStreams.start();
                 }
             });
         }
     }

     In addition, producer methods are defined so that each object needed to work with the application can be injected.
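     Because onStartup() fires a KafkaStreams CDI event, application code can also receive the instance through an observer method. A minimal sketch, assuming the jakarta namespace of Quarkus 3; the class name StreamsStartupObserver and the log message are made up.

     import jakarta.enterprise.context.ApplicationScoped;
     import jakarta.enterprise.event.Observes;
     import org.apache.kafka.streams.KafkaStreams;
     import org.jboss.logging.Logger;

     @ApplicationScoped
     public class StreamsStartupObserver {
         private static final Logger LOGGER = Logger.getLogger(StreamsStartupObserver.class);

         // Called when KafkaStreamsProducer.onStartup() fires the KafkaStreams event,
         // i.e. before the pipeline has been started on the executor.
         void onKafkaStreams(@Observes KafkaStreams streams) {
             LOGGER.infof("KafkaStreams instance available, current state: %s", streams.state());
         }
     }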
  8. Implementing the Kafka Streams application

     @ApplicationScoped
     public class WordCountTopology {
         public static final String WORD_COUNT_STORE = "word-count-store";
         public static final String TEXTS_TOPIC = "Texts";

         @Produces
         public Topology buildTopology() {
             StreamsBuilder builder = new StreamsBuilder();
             builder.stream(TEXTS_TOPIC, Consumed.with(Serdes.String(), Serdes.String()))
                 .flatMapValues(value -> Arrays.asList(value.toLowerCase().split("\\s+")))
                 .groupBy((key, word) -> word, Grouped.with(Serdes.String(), Serdes.String()))
                 .count(Materialized
                     .<String, Long, KeyValueStore<org.apache.kafka.common.utils.Bytes, byte[]>>as(WORD_COUNT_STORE)
                     .withKeySerde(Serdes.String())
                     .withValueSerde(Serdes.Long()));
             return builder.build();
         }
     }
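     To feed this topology during local testing, one option (not part of the deck; localhost:9092 is an assumed broker address) is a small producer that writes lines to the Texts topic:

     import java.util.Properties;
     import org.apache.kafka.clients.producer.KafkaProducer;
     import org.apache.kafka.clients.producer.ProducerRecord;
     import org.apache.kafka.common.serialization.StringSerializer;

     public class TextsProducer {
         public static void main(String[] args) {
             Properties props = new Properties();
             props.put("bootstrap.servers", "localhost:9092"); // assumed local broker
             props.put("key.serializer", StringSerializer.class.getName());
             props.put("value.serializer", StringSerializer.class.getName());
             try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                 producer.send(new ProducerRecord<>("Texts", "hello kafka streams hello quarkus"));
             }
         }
     }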
  9. Implementing the REST entry point

     @Path("/word-count-result")
     @ApplicationScoped
     public class WordCountResource {
         @Inject
         KafkaStreams kafkaStreams;

         @GET
         @Produces(MediaType.APPLICATION_JSON)
         public WordCountResult getWordCountResult() {
             ReadOnlyKeyValueStore<String, Long> store = kafkaStreams.store(
                 StoreQueryParameters.fromNameAndType(
                     WordCountTopology.WORD_COUNT_STORE,
                     QueryableStoreTypes.keyValueStore()));
             List<WordCount> results = new ArrayList<>();
             try (KeyValueIterator<String, Long> iterator = store.all()) {
                 while (iterator.hasNext()) {
                     var entry = iterator.next();
                     results.add(new WordCount(entry.key, entry.value));
                 }
             }
             results.sort(Comparator.comparing(WordCount::count).reversed());
             return new WordCountResult(results);
         }

         public record WordCount(String word, Long count) {}

         public record WordCountResult(List<WordCount> results) {}
     }
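     Issuing GET /word-count-result against a running instance should then return JSON along the lines of {"results":[{"word":"hello","count":2},{"word":"kafka","count":1}]}; the exact counts depend on what has been produced to the Texts topic, and the shape assumes the default Jackson serialization of the two records.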