
Hands-On Kafka Connect: Source to Sink in S3, GCS & Beyond

Posted By: ELK1nG
Published 8/2024
MP4 | Video: h264, 1920x1080 | Audio: AAC, 44.1 KHz
Language: English | Size: 1.41 GB | Duration: 2h 52m

Master Kafka Connect with hands-on experience: the S3 sink and Debezium MySQL CDC source connectors, and Connect cluster setup

What you'll learn

In-depth knowledge of Kafka Connect and its architecture

In-depth practical knowledge of running the S3 sink connector in distributed mode

Setting up a Kafka Connect cluster

Complete understanding of the Debezium MySQL CDC source connector

Why Schema Registry is needed and how to integrate it with sink and source connectors

Schema Evolution in sink and source connectors

Requirements

A working understanding of Apache Kafka

Familiarity with Docker and Docker Compose

Description

This course is dedicated entirely to Kafka Connect and its open-source connectors. Of the many connectors available in the Kafka Connect ecosystem, this course covers one sink connector and one source connector in depth. We begin by learning what Kafka Connect is and how its architecture works.

In the second module, we study the S3 sink connector in detail. We first learn what the S3 sink connector is and install it in standalone mode, then run the same configuration in distributed mode so the difference between the two modes is clear. We explore the following partitioner classes with examples:

Default Partitioner

Time-Based Partitioner

Field Partitioner

After that, we learn how to integrate Kafka Connect with Schema Registry and test schema evolution in BACKWARD compatibility mode. Next, we learn what a dead letter queue (DLQ) is and test it by producing invalid records to Kafka. Lastly, we automate creating the S3 sink connector with a single command using Docker Compose.

Module 3 is dedicated to setting up a Kafka Connect cluster. Here, we provision two machines from AWS and start an S3 sink connector worker process on both. We thoroughly test the load balancing and fault tolerance behaviour of our Kafka Connect cluster.

In Module 4, we explore a popular source connector: the Debezium MySQL CDC source connector. We first learn how the Debezium CDC connector works internally, then start the Debezium MySQL connector in distributed mode using Docker commands. After that, we run DML statements such as INSERT, UPDATE and DELETE queries and examine the corresponding change-event schemas. Similarly, we run DDL statements such as dropping a table and observe how the schema history Kafka topic captures those changes. Lastly, we integrate the connector with Schema Registry and test the setup by running DDL and DML statements.
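To give a feel for the S3 sink setup described above, here is a minimal sketch of a distributed-mode connector config. The connector name, topic, bucket, region, and DLQ topic are illustrative placeholders, not values from the course; only the class names follow the Confluent S3 sink connector's documented properties.

```python
import json

# Sketch of an S3 sink connector config for distributed mode.
# Name, topic, bucket, region, and DLQ topic are hypothetical placeholders.
s3_sink = {
    "name": "s3-sink-demo",
    "config": {
        "connector.class": "io.confluent.connect.s3.S3SinkConnector",
        "tasks.max": "1",
        "topics": "orders",
        "s3.bucket.name": "demo-bucket",
        "s3.region": "us-east-1",
        "storage.class": "io.confluent.connect.s3.storage.S3Storage",
        "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
        # Default partitioner; the course also covers the time-based
        # and field partitioners via the same property.
        "partitioner.class": "io.confluent.connect.storage.partitioner.DefaultPartitioner",
        # Number of records to buffer before writing an S3 object.
        "flush.size": "3",
        # Route bad records to a dead letter queue instead of failing the task.
        "errors.tolerance": "all",
        "errors.deadletterqueue.topic.name": "dlq-orders",
    },
}

# In distributed mode, this JSON is registered by POSTing it to the
# Connect REST API, e.g. http://localhost:8083/connectors
print(json.dumps(s3_sink, indent=2))
```

Switching the `partitioner.class` to the time-based or field partitioner changes how records are laid out into S3 object paths, which is exactly the comparison the partitioner lectures walk through.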
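Similarly, the Debezium MySQL source setup from Module 4 can be sketched as a registration payload. Hostnames, credentials, database names, and topic names below are hypothetical placeholders; the property names follow recent Debezium releases (older versions use `database.server.name` and `database.history.*` instead of `topic.prefix` and `schema.history.internal.*`).

```python
import json

# Sketch of a Debezium MySQL CDC source connector config.
# Hostnames, credentials, and database/topic names are placeholders.
debezium_mysql = {
    "name": "mysql-cdc-demo",
    "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        "database.hostname": "mysql",
        "database.port": "3306",
        "database.user": "debezium",
        "database.password": "dbz-password",
        # Unique numeric ID this connector uses when reading the binlog.
        "database.server.id": "184054",
        # Logical name that prefixes all change-event topics.
        "topic.prefix": "demo",
        "database.include.list": "inventory",
        # DDL changes (ALTER, DROP, ...) are recorded in this internal
        # schema history topic, as shown in the DDL lecture.
        "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
        "schema.history.internal.kafka.topic": "schema-history.inventory",
    },
}
print(json.dumps(debezium_mysql, indent=2))
```

Once registered, each INSERT, UPDATE, or DELETE in the included database produces a change event on a topic named after the prefix, database, and table, which is what the DML lectures inspect.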

Overview

Section 1: Overview

Lecture 1 Course Introduction

Lecture 2 Kafka Connect Architecture

Section 2: Amazon S3 Sink Connector

Lecture 3 What is the S3 Sink Connector?

Lecture 4 Prerequisites

Lecture 5 Standalone (JSON)

Lecture 6 Distributed (JSON)

Lecture 7 Time Based Partitioner

Lecture 8 Field Partitioner [Avro]

Lecture 9 Schema Evolution [Schema Registry]

Lecture 10 Kafka Connect using Docker

Lecture 11 Dead Letter Queues (DLQ)

Lecture 12 Automate using Docker Compose

Section 3: Kafka Connect Cluster

Lecture 13 Setting up Connect Cluster

Lecture 14 Distributed & Fault Tolerance

Section 4: Debezium MySQL CDC Source Connector

Lecture 15 Introduction

Lecture 16 Architecture

Lecture 17 MySQL Setup

Lecture 18 Starting the Debezium MySQL Connector

Lecture 19 DML - INSERT, UPDATE, DELETE

Lecture 20 DDL - ALTER, DROP

Lecture 21 Schema Registry

Who this course is for: Data Engineers, Software Engineers