explained in detail. All parts but the command handler are written only in Clojure. We use the schema registry for schema management, which also keeps the messages smaller. All messages have a string as key and Avro as value type. Meaning of the colors: • Orange: the Confluent platform parts • Yellow: the parts of open bank mark • Green: an Nginx instance • Light blue: PostgreSQL databases
One place to configure the logging for all components. • All components know the topics and data types without needing to connect. • Generates Avro objects for (de)serialization. • Functions wrapping the (Java) Kafka Consumer and Producer. • Functions for dealing with IBAN and UUID.
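The IBAN helpers could look roughly like the sketch below. This is Python, not the actual Clojure code, and `new_iban` plus the "OPEN" bank code are made-up names; the check-digit math is the standard ISO 13616 mod-97 scheme.

```python
import random
import string

def iban_check_digits(country: str, bban: str) -> str:
    """Compute the two mod-97 check digits for an IBAN (ISO 13616)."""
    # Append country code + "00" placeholder, map letters to numbers (A=10 ... Z=35).
    rearranged = bban + country + "00"
    number = int("".join(str(int(c, 36)) for c in rearranged))
    return f"{98 - number % 97:02d}"

def new_iban(bank_code: str = "OPEN") -> str:
    """Generate a random Dutch-style IBAN; 'OPEN' is a hypothetical bank code."""
    account = "".join(random.choices(string.digits, k=10))
    bban = bank_code + account
    return "NL" + iban_check_digits("NL", bban) + bban

def valid_iban(iban: str) -> bool:
    """Move the first four characters to the end; valid IBANs give mod 97 == 1."""
    rearranged = iban[4:] + iban[:4]
    return int("".join(str(int(c, 36)) for c in rearranged)) % 97 == 1
```

Generating a fresh IBAN until one does not exist yet is how the command handler can hand out unique account numbers.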
set. It checks whether the replication factor from the config can be used, taking the minimum of the number of available brokers and the configured value. Note that this has only been used on a clean Kafka cluster, and there is currently no check that the topic properties are correct.
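The replication-factor check boils down to a minimum, since Kafka refuses a replication factor higher than the number of live brokers. A minimal sketch (the function name is made up, not taken from the project):

```python
def effective_replication_factor(configured: int, available_brokers: int) -> int:
    """A topic's replication factor can never exceed the number of live
    brokers, so take the minimum of the configured value and broker count."""
    return min(configured, available_brokers)
```

So a config asking for 3 replicas on a single-broker development cluster quietly becomes 1.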
simple message with just a long value. It exposes an nREPL. An nREPL is a network REPL, which can be used to execute code remotely and get the result back. This is a powerful concept, making it possible to apply fixes while the code runs, or to solve bugs interactively. With the nREPL the pace of the sent messages can be changed.
each received heartbeat. At first this will be a ConfirmAccountCreation; as it runs there will be fewer of these. It randomly creates different kinds of ConfirmMoneyTransfer commands, which might fail because they would cause the balance to drop below the limit.
generates a new IBAN; if it does not already exist, creates a balance using the default values, and if it does exist, gives back an AccountCreationFailed. • ConfirmMoneyTransfer: if the supplied token is correct and there is enough money, makes the transfer. Updates both the to and from balances if they are ‘open-bank’ IBANs, and creates a BalanceChanged event for each changed balance.
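The ConfirmMoneyTransfer logic can be sketched with an in-memory model. This is a hypothetical Python rendering, not the project's code: the real handler uses a PostgreSQL database and Avro messages, and the token check is omitted here.

```python
from dataclasses import dataclass

@dataclass
class Balance:
    amount: int  # in cents
    limit: int   # lowest allowed amount, e.g. 0 or a negative overdraft limit

def confirm_money_transfer(balances: dict, from_iban: str, to_iban: str, cents: int):
    """Return ('MoneyTransferConfirmed', changed_ibans) or ('MoneyTransferFailed', reason).

    Only balances present in `balances` (the 'open-bank' IBANs) are updated;
    a BalanceChanged event would be emitted for each IBAN in `changed_ibans`."""
    changed = []
    sender = balances.get(from_iban)
    if sender is not None:
        if sender.amount - cents < sender.limit:
            return "MoneyTransferFailed", "insufficient funds"
        sender.amount -= cents
        changed.append(from_iban)
    receiver = balances.get(to_iban)
    if receiver is not None:
        receiver.amount += cents
        changed.append(to_iban)
    return "MoneyTransferConfirmed", changed
```

Transfers to or from an external IBAN simply touch only the one side that is an open-bank balance.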
issue commands and get the results back in the frontend. All services have their own consumer, and share the producer and the database. • Transaction service: makes it possible to query or subscribe to BalanceChanged events. • Account creation service: used to create an account. Links the username used to log in with the UUID sent for the account creation, so the user gets the same IBAN back should they log in at another time. • Money transfer service: tries to transfer money and provides feedback.
do several kinds of transactions that either increase or decrease the money on the balance, in such a way that as much goes in as goes out after 10 runs. It measures the time until the new balance comes in. During the test the load on the system is increased by using the nREPL of the heartbeat, increasing the number of heartbeats, which in turn triggers additional commands to be processed. Also during the test, both the CPU and memory of parts of the system are measured using lispyclouds/clj-docker-client. All the data is written to a file so it can be analyzed later on.
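One way the "as much goes in as goes out after 10 runs" property can be achieved is with a fixed cycle of amounts that nets to zero. This is a made-up sketch of that idea, not the project's actual amounts:

```python
def balanced_cycle(step: int = 100, length: int = 10) -> list:
    """A cycle of `length` transaction amounts summing to zero: the first
    half adds money, the mirrored second half removes it again, so the
    balance ends each full run where it started."""
    half = [step * (i + 1) for i in range(length // 2)]
    return half + [-a for a in half]
```

Because the cycle nets to zero, the test can repeat indefinitely without the balance drifting toward the limit.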
other files to generate graphs. All the data is combined, and for each point with the same load some statistics are calculated, most often the mean and the standard error. For the different values, graphs are generated in the public folder of the frontend so they can be viewed easily. They are available on the background tab at open-bank.
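The per-load statistics are straightforward; a small sketch of mean and standard error of the mean (sample standard deviation over the square root of the sample count), again in Python rather than the project's Clojure:

```python
import math

def mean_and_standard_error(samples):
    """Return (mean, standard error of the mean) for a list of measurements.

    Uses the sample standard deviation (n - 1 denominator), so at least
    two samples are required."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return mean, math.sqrt(variance) / math.sqrt(n)
```

The standard error is what makes the error bars on the latency graphs shrink as more samples per load level come in.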
Benevolent Dictator for Life. • Runs on the JVM, and has interop with Java. • Cognitect is the company behind Clojure; it has several products around Clojure, like Datomic, an elastically scaling transactional database. • Multiple recent libraries; besides the consumer and the producer, some also support streams, the admin client and Avro. • At the time I started the project the latest Clojure was still Java 6 compatible, and there was no recent Clojure Kafka client. • Some fuss with Jackson in combination with other libraries; using explicit Jackson versions made it work.
to IntelliJ IDEA. • Can convert Java code to Kotlin automatically. • A bit more functional than Java, and often immutable by default. • Spring makes it easy to set up and have something working fast. • Getting Avro serializers to work was a puzzle: finding the right properties to use Avro serialization. • Spring Cloud Stream uses the Kafka Streams API under the hood. • Easiest is to start on Spring Initializr. • Make sure to use the kotlin-maven-allopen and kotlin-maven-noarg plugins to compile.
safety and speed. • Mozilla was the first investor in Rust and continues to sponsor the work of the open source project. • Used by Dropbox in production. • Two libraries: one that has recently become more active and bumped to 1.0.0 of librdkafka, and another in pure Rust, but with little activity and few features. • No support for Avro when I started. • Created a library that uses the schema registry to transform bytes to a Value and the other way around, and also to set a schema in the schema registry. • The library is more low-level than the Java one; things like logging have to be set up. Some examples are available, making that easy.
simple and bare Docker image. • Rust-rdkafka only needs slightly more. • Clojure is pretty close to Rust after the JIT has kicked in. • The Kotlin JIT seems about as effective, but with more overhead because of Spring.
other options for the JVM with GraalVM, like Quarkus or Micronaut. • Memory footprint matters. • A small Docker image is important. • Memory safety is important. But: • Be sure to test whether in your case the broker can keep up. • What the application needs to do can be done with Rust. • Development may take a bit longer.