Kafka for Developers - Data Contracts Using Schema Registry.

The course begins with an introduction that outlines what to expect. We will cover the relationship between serialization and Kafka, and the benefits serialization brings to the overall Kafka architecture. You will gain an understanding of the different serialization formats and the schema support available in AVRO, Protobuf, and Thrift. You will be introduced to AVRO and learn why it is a popular choice for working with Kafka and Schema Registry.
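To give a flavor of what an AVRO data contract looks like, here is a minimal sketch: a record schema defined as JSON and parsed with the Apache Avro library. This assumes the `org.apache.avro:avro` dependency is on the classpath; the `Greeting` record and its field are illustrative, not the course's exact schema.

```java
import org.apache.avro.Schema;

public class GreetingSchemaDemo {
    // An AVRO schema is plain JSON describing a record and its typed fields.
    // This sample schema is an assumption for illustration purposes.
    private static final String GREETING_SCHEMA = """
            {
              "type": "record",
              "name": "Greeting",
              "namespace": "com.example",
              "fields": [
                {"name": "message", "type": "string"}
              ]
            }
            """;

    public static void main(String[] args) {
        // Parse the JSON into an in-memory Schema object.
        Schema schema = new Schema.Parser().parse(GREETING_SCHEMA);
        System.out.println(schema.getName());          // record name
        System.out.println(schema.getFields().size()); // number of fields
    }
}
```

Because the schema travels with (or is registered alongside) the data, producers and consumers can validate records against the same contract.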

Further in this course, we will set up Kafka locally and produce and consume messages using the Kafka Console Producer and Consumer. You will set up the base project for the greeting app and use it to generate Java classes from the greetings schema, first with the Gradle build tool and then with the Maven build tool.

You will understand the different techniques for evolving a schema as business requirements change. In later sections, you will code and build a Spring Boot Kafka application that exchanges data in AVRO format and interacts with Schema Registry for data evolution. You will also build a RESTful service that receives events through a REST interface and publishes them to Kafka.
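As a rough sketch of how such an application wires AVRO serialization to Schema Registry, the producer configuration typically looks like the following. The broker and registry URLs are the common local defaults and the serializer class names are the standard Confluent ones; treat the exact values as assumptions rather than the course's actual code.

```java
import java.util.Properties;

public class AvroProducerConfig {
    // Sketch of typical producer settings for AVRO + Confluent Schema Registry.
    // URLs are the usual local defaults (assumptions, adjust for your setup).
    public static Properties producerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // local Kafka broker
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081"); // Schema Registry default port
        return props;
    }

    public static void main(String[] args) {
        System.out.println(producerProps());
    }
}
```

With this configuration, the `KafkaAvroSerializer` registers (or looks up) the record's schema in Schema Registry and embeds only the schema ID in each message, which is how compatibility checks during schema evolution are enforced.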

By the end of this course, you will have a complete understanding of how to use AVRO as a data serialization format and of how Schema Registry supports the evolution of data.

All resources and code files are placed here: https://github.com/PacktPublishing/Kafka-for-Developers---Data-Contract…

Type
video
Category
publication date
2023-03-16
what you will learn

Understand the fundamentals of data serialization
Understand the different serialization formats available
Consume AVRO records using Kafka Consumer
Publish AVRO records using Kafka Producer
Enforce data contracts using Schema Registry
Use Schema Registry to register the AVRO Schema

duration
332
key features
Introduction to AVRO and the advantages of using it to share messages between applications
Learn how Kafka Producer and Consumer interact with the Schema Registry
Build Spring Boot Kafka Producer and Consumer applications that use AVRO as a serialization format
approach
This is a hands-on course in which you will learn the concepts through code. It is structured to give you both theoretical grounding and coding experience in building Kafka applications using AVRO and Schema Registry.
audience
This course is suitable for experienced Java developers who want to learn AVRO and how to exchange data between applications using AVRO and Kafka.

It is also suitable for developers interested in learning about Schema Registry, how it fits into Kafka, and techniques for evolving data. Prior knowledge of Java and experience building a Kafka Producer are required to take this course.
meta description
Learn to build Kafka applications using AVRO and Schema Registry with the help of this course
short description
This course mixes theory and coding to give you experience in building Kafka applications using AVRO and Schema Registry. You will code and build a coffee order service using Spring Boot and Schema Registry. Anyone interested in learning about Schema Registry and how to build Kafka Producer and Consumer applications that interact with it can take this course.
subtitle
Build a Kafka Producer/Consumer application that uses AVRO data format and Confluent Schema Registry
keywords
Kafka, AVRO, Schema registry, serialization, producer, consumer, data contracts, serialization format, Java, Java classes
Product ISBN
9781837633487