Real-Time Stream Processing Using Apache Spark 3 for Python Developers

Take your first steps towards discovering, learning, and using Apache Spark 3.0. This carefully structured course takes a live-coding approach and explains all the core concepts you need along the way.

In this course, we will cover real-time stream processing concepts and the Spark Structured Streaming APIs and architecture.

We will work with file streams and the Kafka source, and integrate Spark with Kafka. Next, we will learn about stateless and stateful streaming transformations, and then cover windowing aggregates with Spark Structured Streaming. After that, we will cover watermarking and state cleanup, followed by streaming joins and aggregations and how to handle memory problems with streaming joins. Finally, we will learn to create arbitrary streaming sinks.
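
To give a flavour of the kind of pipeline the course builds, here is a minimal PySpark Structured Streaming sketch (not code from the course itself): it reads JSON events from a Kafka topic, applies a watermark, and computes tumbling-window counts. The broker address, topic name, and event schema are illustrative placeholders.

```python
# Minimal sketch: Kafka source -> watermark -> tumbling-window counts -> console sink.
# Requires the spark-sql-kafka connector package on the classpath at runtime.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col, window
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = (SparkSession.builder
         .appName("StreamingWindowCountSketch")
         .getOrCreate())

# Placeholder event schema: each Kafka record value is a JSON-encoded event.
event_schema = StructType([
    StructField("event_time", TimestampType()),
    StructField("word", StringType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
       .option("subscribe", "events")                         # placeholder topic
       .load())

events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), event_schema).alias("e"))
          .select("e.*"))

# Stateful aggregation: 5-minute tumbling windows, with a 10-minute watermark
# so that old window state can be cleaned up.
counts = (events
          .withWatermark("event_time", "10 minutes")
          .groupBy(window(col("event_time"), "5 minutes"), col("word"))
          .count())

query = (counts.writeStream
         .outputMode("update")
         .format("console")
         .option("truncate", "false")
         .start())

query.awaitTermination()
```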

By the end of this course, you will be able to create real-time stream processing applications using Apache Spark.

All the resources for the course are available at https://github.com/PacktPublishing/Real-time-stream-processing-using-Ap…

Type
video
Category
publication date
2022-02-25
what you will learn

Explore stateless and stateful streaming transformations
Perform windowing aggregates using Spark Structured Streaming
Learn watermarking and state cleanup
Implement streaming joins and aggregations
Handle memory problems with streaming joins
Learn to create arbitrary streaming sinks (see the sketch after this list)
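
As an illustration of the last point, below is a minimal sketch of an arbitrary streaming sink built with foreachBatch (not code from the course): each micro-batch is handed to an ordinary batch function that can write to any target. The rate source, output path, and checkpoint location are placeholders.

```python
# Sketch of an "arbitrary" streaming sink using foreachBatch.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ForeachBatchSinkSketch").getOrCreate()

# A trivial source for illustration: the built-in rate source emits
# (timestamp, value) rows at a fixed rate.
stream = (spark.readStream
          .format("rate")
          .option("rowsPerSecond", 10)
          .load())

def write_batch(batch_df, batch_id):
    # batch_df is a normal (static) DataFrame, so any batch writer works here.
    (batch_df.write
     .mode("append")
     .format("parquet")
     .save("/tmp/stream_output"))  # placeholder output path

query = (stream.writeStream
         .foreachBatch(write_batch)
         .option("checkpointLocation", "/tmp/stream_checkpoint")  # placeholder
         .start())

query.awaitTermination()
```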

duration
274 minutes
key features
Learn real-time stream processing concepts
Understand Spark Structured Streaming APIs and architecture
Work with file streams and the Kafka source, and integrate Spark with Kafka (see the sketch below)
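
For the file-stream feature, a minimal sketch of how a directory can serve as a streaming source is shown below (the directory path and schema are placeholders, not course code): Spark watches the directory and treats each newly arriving CSV file as streaming input.

```python
# Sketch of a file-stream source: new CSV files in a directory become streaming input.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("FileStreamSketch").getOrCreate()

# Streaming file sources require an explicit schema; this one is a placeholder.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
])

orders = (spark.readStream
          .format("csv")
          .schema(schema)
          .option("header", "true")
          .load("/tmp/incoming_orders"))  # placeholder input directory

query = (orders.writeStream
         .format("console")
         .outputMode("append")
         .start())

query.awaitTermination()
```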
approach
This course is example-driven and follows a working-session-like approach: it delivers live-coding sessions and explains the concepts along the way.
audience
This course is designed for software engineers and architects who want to design and develop big data engineering projects using Apache Spark. It is also designed for programmers and developers who aspire to grow into data engineering with Apache Spark.

For this course, you need to know Spark fundamentals and have some exposure to the Spark DataFrame APIs. You should also know Kafka fundamentals and have a working knowledge of Apache Kafka, as well as programming experience in Python.
meta description
Build your own real-time stream processing applications using Apache Spark 3.x and PySpark
short description
Get to grips with real-time stream processing using PySpark and Spark Structured Streaming, and apply that knowledge to build stream processing solutions. This course is example-driven and follows a working-session-like approach.
subtitle
Learn to create real-time stream processing applications using Apache Spark
keywords
Apache Spark, Spark 3.0, SQL, PyCharm, Python, Spark Structured API, Data Sources and Sinks, Data Engineering, Data Processing, Data Frames, Spark SQL
Product ISBN
9781803246543