Confluent Developer Skills for Building Apache Kafka®

Prerequisites

Attendees should have a working knowledge of the Kafka architecture, either through:

  • Prior experience, or
  • Completing training beforehand to ensure familiarity with the relevant concepts. Visit www.confluent.io/training to learn the fundamentals of data streaming & Apache Kafka.

 

It's also important to have strong knowledge of Linux/Unix & an understanding of basic TCP/IP networking concepts. Familiarity with the Java Virtual Machine (JVM) is helpful.

Professionals need to provide a laptop or computer with unobstructed internet access to participate in the class.

To evaluate your Kafka knowledge for this training, please complete the self-assessment: https://cnfl.io/fundamentals-quiz

Audience Profile

Application developers & architects who want to write applications that interact with Apache Kafka. The training treats Java as a first-class citizen, but professionals will derive value even if Java is not their primary programming language. C# & Python clients are also used in some lab options.

Learning Objectives

The lessons & activities in this training enable professionals to build the skills to:

  • Write Producers & Consumers to send data to & read data from Kafka
  • Integrate Kafka with external systems using Kafka Connect
  • Write streaming applications with Kafka Streams & ksqlDB
  • Integrate a Kafka client application with Confluent Cloud

Content Outline

  • Write code to connect to a Kafka cluster
  • Distinguish between leaders & followers & work with replicas
  • Explain what a segment is & explore retention
  • Use the CLI to work with topics, producers, & consumers
  • Describe the work a producer performs, & the core components needed to produce messages
  • Create producers & specify configuration properties
  • Explain how to configure producers to confirm that Kafka has received messages
  • Delve into how batching works & explore batching configurations
  • Explore reacting to failed delivery & tuning producers with timeouts
  • Use the APIs for Java, C#/.NET, or Python to create a Producer (see the producer sketch after this outline)
  • Create & manage consumers & their property files
  • Illustrate how consumer groups & partitions provide scalability & fault tolerance
  • Explore managing consumer offsets
  • Tune fetch requests
  • Explain how consumer groups are managed & their benefits
  • Compare & contrast group management strategies & when you might use each
  • Use the API for Java, C#/.NET, or Python to create a Consumer (see the consumer sketch after this outline)
  • Describe Kafka schemas & how they work
  • Write an Avro-compatible schema & explore using Protobuf & JSON schemas
  • Write schemas that can evolve
  • Write & read messages using schema-enabled Kafka client applications
  • Use Avro & the API for Java, C#/.NET, or Python to write a schema-enabled producer or consumer that leverages the Confluent Schema Registry (see the schema-enabled producer sketch after this outline)
  • Develop an appreciation for what streaming applications can do for you back on the job
  • Describe Kafka Streams & explore Streams properties & topologies
  • Compare & contrast streams & tables, & relate events in streams to records/messages in topics
  • Write an application with the help of the Streams DSL (Domain-Specific Language) (see the Streams sketch after this outline)
  • Describe how Kafka Streams & ksqlDB relate
  • Explore the ksqlDB CLI
  • Use ksqlDB to filter & transform data
  • Compare & contrast types of ksqlDB queries
  • Leverage ksqlDB to perform time-based stream operations
  • Write a ksqlDB query that relates data between two streams or a stream & a table
  • List some of the components of Kafka Connect & describe how they relate
  • Set configurations for components of Kafka Connect
  • Describe connect integration & how data flows between applications & Kafka
  • Explore some use cases where Kafka Connect makes development efficient
  • Use Kafka Connect in conjunction with other tools to process data in motion in the most efficient way
  • Create a Connector & import data from a database to a Kafka cluster
  • Delve into how compaction affects consumer offsets
  • Explore how consumers work with offsets in scenarios outside of normal processing behaviour & understand how to manipulate offsets to deal with anomalies (see the offset-rewind sketch after this outline)
  • Evaluate decisions about consumer & partition counts & how they relate
  • Address decisions that arise from default key-based partitioning & consider alternative partitioning strategies
  • Configure producers to deliver messages without duplicates & with ordering guarantees
  • List ways to manage large message sizes
  • Describe how to work with messages in transactions & how Kafka enables transactions (see the transactional producer sketch after this outline)
  • Compare & contrast error handling options with Kafka Connect, including the dead letter queue
  • Distinguish between various categories of testing
  • List considerations for stress & load testing a Kafka system
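
The following sketches illustrate a few of the skills in the outline above using Java, the course's first-class language. They are minimal illustrations rather than the course's lab code; broker addresses, topic names, group IDs, class names, & schemas are placeholders.

A minimal producer sketch, assuming a broker at localhost:9092 & a topic named greetings; acks=all asks the cluster to acknowledge a write only after all in-sync replicas have it:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class GreetingProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.ACKS_CONFIG, "all");                            // wait for all in-sync replicas

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                new ProducerRecord<>("greetings", "key-1", "hello, kafka");      // placeholder topic
            // send() is asynchronous; the callback reports success or failure for this record.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Wrote to %s-%d at offset %d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
            producer.flush();
        }
    }
}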
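
A minimal consumer sketch under the same placeholder assumptions; consumers sharing a group.id split the topic's partitions between them, & with the default settings offsets are committed automatically:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class GreetingConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // placeholder broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "greetings-app");             // placeholder consumer group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");         // start from the beginning if no committed offset

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("greetings"));         // placeholder topic
            while (true) {                                                      // poll until the process is stopped
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("partition=%d offset=%d value=%s%n",
                        record.partition(), record.offset(), record.value());
                }
            }
        }
    }
}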
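
A schema-enabled producer sketch using Avro with the Confluent Schema Registry serializer, assuming a registry at localhost:8081 & a hypothetical Order schema & orders topic; the serializer registers the schema & embeds its ID in every message:

import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroOrderProducer {
    // Hypothetical Avro schema used only for this illustration.
    private static final String ORDER_SCHEMA =
        "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
      + "{\"name\":\"id\",\"type\":\"string\"},"
      + "{\"name\":\"amount\",\"type\":\"double\"}]}";

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
            "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
            "io.confluent.kafka.serializers.KafkaAvroSerializer");              // Confluent Avro serializer
        props.put("schema.registry.url", "http://localhost:8081");              // placeholder registry URL

        Schema schema = new Schema.Parser().parse(ORDER_SCHEMA);
        GenericRecord order = new GenericData.Record(schema);
        order.put("id", "order-42");
        order.put("amount", 19.99);

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "order-42", order));   // placeholder topic
            producer.flush();
        }
    }
}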
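
A Kafka Streams DSL sketch, assuming placeholder input & output topics; the topology reads each event from a stream, transforms its value, & writes the result to another topic:

import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class ShoutingApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "shouting-app");          // placeholder application id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");     // placeholder broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Build the topology: consume, transform each value, produce.
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> greetings = builder.stream("greetings");         // placeholder input topic
        greetings.mapValues(value -> value.toUpperCase())
                 .to("greetings-uppercase");                                     // placeholder output topic

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}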
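
An offset-rewind sketch for handling anomalies outside normal processing, assuming a placeholder topic, partition, & known-good offset; seek() repositions the consumer so the next poll() starts from that offset:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

public class OffsetRewind {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");    // placeholder broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "orders-app");                 // placeholder consumer group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Assign the partition explicitly (no group rebalancing) & rewind to a known-good offset.
            TopicPartition partition = new TopicPartition("orders", 0);          // placeholder topic & partition
            consumer.assign(Collections.singletonList(partition));
            consumer.seek(partition, 1000L);                                     // assumed known-good offset
            consumer.poll(Duration.ofMillis(500))
                    .forEach(r -> System.out.printf("offset=%d value=%s%n", r.offset(), r.value()));
        }
    }
}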
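
A transactional producer sketch, assuming placeholder topics & a placeholder transactional.id; idempotence removes duplicates introduced by retries, & the transaction makes the two writes visible to read_committed consumers together or not at all:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TransactionalOrderProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");    // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");             // no duplicates on retry
        props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "orders-tx-1");        // placeholder transactional id

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.initTransactions();
            try {
                producer.beginTransaction();
                producer.send(new ProducerRecord<>("orders", "order-42", "created"));        // placeholder topics
                producer.send(new ProducerRecord<>("order-audit", "order-42", "created"));
                producer.commitTransaction();   // both records are exposed atomically
            } catch (Exception e) {
                producer.abortTransaction();    // neither record is exposed if anything fails mid-transaction
                e.printStackTrace();
            }
        }
    }
}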

FAQs

A: Confluent Platform is a full-scale data streaming platform that enables you to easily access, store, & manage data as continuous, real-time streams.

 

A: Confluent enables simple, modern streaming data pipelines & integration — the E & L in ETL — through pre-built data connectors.

A: Confluent provides a truly cloud-native experience, completing Kafka with a holistic set of enterprise-grade features to unleash developer productivity, operate efficiently at scale, & meet all of your architectural requirements before moving to production.

A: AWS Partner Confluent relies on its AWS Service Ready designations to meet the security needs of customers in highly regulated industries. By demonstrating its validated AWS technology expertise, Confluent is helping customers increase data protection, enhance data analytics, & drive business growth.

A: Confluent Cloud enables you to sign up & use the product with a temporary free trial, which allows you to test & evaluate the platform without entering your payment information. To use the Confluent Cloud free trial, sign up for Confluent Cloud.

A: A Confluent organization is a resource that gives the mapping between the Azure & Confluent Cloud resources. It's the parent resource for other Confluent Cloud resources. Each Azure subscription can contain multiple Confluent plans.

A: To attend the training session, you should have an operational desktop or laptop with the required specifications, along with a good internet connection to access the labs.

A: We always recommend attending the live session to practice, clarify doubts instantly, & get more value from your investment. However, if due to some contingency you have to skip a class, Radiant Techlearning will provide you with the recorded session of that particular day. Please note that those recorded sessions are meant only for personal consumption & NOT for distribution or any commercial use.

A: Radiant Techlearning has a data center containing the virtual training environment for professional hands-on practice. Professionals can easily access these labs over the cloud with the help of a remote desktop connection. Radiant virtual labs give you the flexibility to learn from anywhere in the world & in any time zone.

A: Learners will be engaged with real-world, industry-oriented projects during the training program. These projects will improve your skills & knowledge, & you will gain better experience. These real-time projects will also help you with your future tasks & assignments.

 

A: You can request a refund if you do not wish to enroll in the training.

A: Yes, you can.

A: We use the best standards in Internet security. Any data retained isn't shared with third parties.

A: It is recommended but not mandatory. Being acquainted with the primary training material will enable professionals & the trainer to move at the desired pace during classes. You can access training for most vendors.

 

A: You can buy online from the page by clicking on "Buy Now". You can view alternate payment methods on the payment options page.

 

A: Yes, professionals can pay from the training page.

A: The training completion certificate will be awarded to all professionals who've completed the training program & the project assignment given by the instructor. You may use the certificate in your future job interviews; it will surely help you land your dream job.

 

A: Radiant believes in a practical & creative approach to training & development, which distinguishes it from other training & development platforms. Moreover, training is delivered by experts with a range of experience in their domain.

 

A: Radiant's team of experts will be available via e-mail at support@radianttechlearning.com to answer your technical queries even after the training program.

 

A: Yes, Radiant will provide you with the most up-to-date, high-value & relevant real-time projects & case studies in each training program.

 

A: Technical issues are unpredictable & might occur on our side as well. Professionals have to ensure they have access to the required configuration with good internet speed.

 

A: Radiant Techlearning offers training programs on weekdays, weekends, & a combination of weekdays & weekends. We give you complete liberty to choose the schedule that suits your needs.

 

A: Radiant has rigorous selection criteria for the Technology Trainers & Professionals who deliver training programs. Our trainers & professionals undergo rigorous technical & behavioral interviews & assessment processes before they are onboarded into the company.

Our technology experts & professionals carry deep knowledge of the technical subject & are certified by the OEM.

Our training programs are practically oriented, with 70% – 80% hands-on practice with the training technology tool. Our training program focuses on one-on-one interaction with each professional, the latest curriculum content, & real-time projects & case studies during the training program.

Our faculty will teach each training program from the fundamentals in an easy-to-follow way, & you are free to ask your respective faculty about your doubts at any time.

Our trainers have the patience & ability to explain difficult concepts in a simple way, with both depth & breadth of knowledge.

To ensure quality learning, we provide a support session even after the training program.

 
