Welcome to the Kafka Streams API video course, where you will dive deep into building powerful Kafka Streams applications. The first section introduces the fundamental concepts and terminology of Kafka Streams development. You will then build a simple Kafka Streams app and test it locally to gain hands-on experience.
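To give a feel for that first app, here is a minimal sketch of a Kafka Streams topology that reads from an input topic, uppercases each value, and writes the result to an output topic. The topic names, application id, and broker address are placeholders for illustration, not necessarily the ones used in the course.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class GreetingsApp {

    public static void main(String[] args) {
        // Basic configuration: application id, broker address, and default serdes.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "greetings-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Define the topology: read, transform each value, write.
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> greetings = builder.stream("greetings");
        greetings
                .mapValues(value -> value.toUpperCase())
                .to("greetings-uppercase");

        // Start the streams client and close it cleanly on JVM shutdown.
        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```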
Next, you will explore the various operators available in the Kafka Streams API and gain a solid understanding of how they contribute to building robust streaming applications. You will also delve into serialization and deserialization, learning how to create a generic serializer and deserializer that can be reused for any type of message.
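As a preview of that generic approach, the sketch below shows a JSON serializer and deserializer built on Jackson's ObjectMapper that work for any message type T. The class names and the choice of JSON are assumptions for illustration; the course's exact implementation may differ.

```java
// JsonSerializer.java
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Serializer;

// Generic JSON serializer: turns any object of type T into bytes.
public class JsonSerializer<T> implements Serializer<T> {

    private final ObjectMapper objectMapper = new ObjectMapper();

    @Override
    public byte[] serialize(String topic, T data) {
        if (data == null) {
            return null;
        }
        try {
            return objectMapper.writeValueAsBytes(data);
        } catch (Exception e) {
            throw new RuntimeException("Failed to serialize record for topic " + topic, e);
        }
    }
}
```

```java
// JsonDeserializer.java
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Deserializer;

// Generic JSON deserializer: parameterized by the concrete target class.
public class JsonDeserializer<T> implements Deserializer<T> {

    private final ObjectMapper objectMapper = new ObjectMapper();
    private final Class<T> targetType;

    public JsonDeserializer(Class<T> targetType) {
        this.targetType = targetType;
    }

    @Override
    public T deserialize(String topic, byte[] bytes) {
        if (bytes == null) {
            return null;
        }
        try {
            return objectMapper.readValue(bytes, targetType);
        } catch (Exception e) {
            throw new RuntimeException("Failed to deserialize record from topic " + topic, e);
        }
    }
}
```

The pair can then be wrapped into a single Serde with Serdes.serdeFrom(serializer, deserializer) wherever the topology needs one.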
Moving forward, you will take on the exciting task of implementing an order management system for a retail company using Kafka Streams. You will explore error handling mechanisms, the KTable and GlobalKTable abstractions, and stateful operators and aggregations. You will also learn why rekeying records matters and how joins are used in the application.
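To illustrate the rekeying and aggregation ideas, here is a small sketch that counts orders per store: each order is rekeyed by its store id and the grouped stream is then counted into a KTable. The Order record, the topic names, and the per-store grouping are illustrative assumptions rather than the course's actual domain model.

```java
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class OrdersTopology {

    // A hypothetical order event; the real course domain model may differ.
    public record Order(String orderId, String storeId, double amount) {}

    public static void build(StreamsBuilder builder, Serde<Order> orderSerde) {
        KStream<String, Order> orders = builder
                .stream("orders", Consumed.with(Serdes.String(), orderSerde));

        // Rekey by store id so the aggregation groups records per store,
        // then count the orders for each store into a KTable.
        KTable<String, Long> ordersPerStore = orders
                .selectKey((orderId, order) -> order.storeId())
                .groupByKey(Grouped.with(Serdes.String(), orderSerde))
                .count();

        // Emit the running counts to an output topic.
        ordersPerStore
                .toStream()
                .to("orders-count", Produced.with(Serdes.String(), Serdes.Long()));
    }
}
```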
Continuing your journey, you will learn how to write automated tests for Kafka Streams apps, including unit tests with the TopologyTestDriver and integration tests using Embedded Kafka. You will also explore the grace period and how Kafka Streams uses it to handle late-arriving records.
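A TopologyTestDriver unit test typically follows the pattern sketched below, shown here against the hypothetical uppercase-greetings topology from the earlier sketch; the topic names and the asserted values are placeholders.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.TestOutputTopic;
import org.apache.kafka.streams.TopologyTestDriver;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;

import static org.junit.jupiter.api.Assertions.assertEquals;

class GreetingsTopologyTest {

    private TopologyTestDriver testDriver;
    private TestInputTopic<String, String> inputTopic;
    private TestOutputTopic<String, String> outputTopic;

    @BeforeEach
    void setUp() {
        // Rebuild the same simple topology as the earlier sketch.
        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("greetings", Consumed.with(Serdes.String(), Serdes.String()))
               .mapValues(value -> value.toUpperCase())
               .to("greetings-uppercase", Produced.with(Serdes.String(), Serdes.String()));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "greetings-test");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234");

        // Drive the topology in-memory, with no real broker involved.
        testDriver = new TopologyTestDriver(builder.build(), props);
        inputTopic = testDriver.createInputTopic(
                "greetings", new StringSerializer(), new StringSerializer());
        outputTopic = testDriver.createOutputTopic(
                "greetings-uppercase", new StringDeserializer(), new StringDeserializer());
    }

    @AfterEach
    void tearDown() {
        testDriver.close();
    }

    @Test
    void upperCasesEachGreeting() {
        inputTopic.pipeInput("key-1", "good morning");
        assertEquals("GOOD MORNING", outputTopic.readValue());
    }
}
```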
Finally, you will learn how to package your Kafka Streams app as an executable and launch it effectively.
By the end of this course, you will have a comprehensive understanding of the Kafka Streams API, enabling you to build a wide range of applications using this powerful tool.
Build advanced Kafka Streams applications using the Streams API
Build Kafka Streams applications using the high-level DSL
Test Kafka Streams topologies using TopologyTestDriver and JUnit 5
Test Spring Kafka Streams apps using EmbeddedKafka and JUnit 5
Aggregate multiple events into a single aggregated event
Join multiple streams into one joined stream (see the join sketch below)
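As referenced in the last item above, joining two streams might look like the following sketch. The topic names, the String values, and the five-second join window with a two-second grace period are illustrative assumptions; the grace period lets records that arrive slightly late still participate in the join.

```java
import java.time.Duration;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.JoinWindows;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.StreamJoined;

public class JoinTopology {

    public static void build(StreamsBuilder builder) {
        // Two hypothetical input streams keyed by the same id.
        KStream<String, String> alphabets = builder
                .stream("alphabets", Consumed.with(Serdes.String(), Serdes.String()));
        KStream<String, String> abbreviations = builder
                .stream("alphabet-abbreviations", Consumed.with(Serdes.String(), Serdes.String()));

        // Join records that share a key and arrive within five seconds of each other;
        // the two-second grace period still admits slightly late records.
        KStream<String, String> joined = alphabets.join(
                abbreviations,
                (description, abbreviation) -> abbreviation + " = " + description,
                JoinWindows.ofTimeDifferenceAndGrace(Duration.ofSeconds(5), Duration.ofSeconds(2)),
                StreamJoined.with(Serdes.String(), Serdes.String(), Serdes.String()));

        joined.to("alphabets-joined", Produced.with(Serdes.String(), Serdes.String()));
    }
}
```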
Prerequisites for this course include a solid foundation in Java programming and prior experience building Kafka applications. Familiarity with IntelliJ or another IDE is recommended. Additionally, a working knowledge of Java 17 and an understanding of Gradle or Maven are necessary.