info@zetlantechnologies.com | +91-8680961847

Red Hat Certified Specialist in Event-Driven Development with Kafka

EX482: The Red Hat Certified Specialist in Event-Driven Development with Kafka exam validates your knowledge of coding event-driven applications using Apache Kafka and developing with Apache Kafka Streams. Passing this exam also counts toward earning a Red Hat Certified Architect (RHCA®) certification. The exam is intended for Java developers and architects who are implementing event-driven applications using Apache Kafka and Kubernetes.

1. Understand and work with event-driven applications with AMQ Streams API

  • Know how to send data to and read data from Kafka.
  • Develop microservices and other types of applications to share data with extremely high throughput and low latency.
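
The send/read model above can be sketched in plain Python (no broker involved; this is an illustrative in-memory model, not the Kafka client API): a topic is a set of append-only partition logs, and consumers track their own read offsets.

```python
# Illustrative in-memory model of a topic (no broker involved):
# each partition is an append-only log; consumers track their own offsets.
class MiniTopic:
    def __init__(self, partitions=3):
        self.logs = [[] for _ in range(partitions)]

    def produce(self, key, value):
        # Same key -> same partition, which is what preserves per-key ordering.
        p = hash(key) % len(self.logs)
        self.logs[p].append((key, value))
        return p, len(self.logs[p]) - 1          # (partition, offset)

    def consume(self, partition, offset):
        # A consumer reads from its last committed offset onward.
        return self.logs[partition][offset:]

topic = MiniTopic()
p, off = topic.produce("order-42", "created")
topic.produce("order-42", "paid")
records = topic.consume(p, off)
# Both records for key "order-42" land in the same partition, in order.
```

Keying records is what gives Kafka its per-key ordering guarantee: all records with the same key go to the same partition.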

  • Create, configure, and manage topics.
  • Configure the ecosystem to share data with extremely high throughput and low latency.
  • Scale and guarantee message ordering.
  • Use message compaction to remove old records, and know how to configure it.
  • Configure and use data replication to control fault tolerance.
  • Retain high volumes of data for immediate access.
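
Compaction, mentioned above, keeps only the most recent record per key. A minimal sketch of the idea in plain Python — in reality compaction runs in the broker's log cleaner and is enabled per topic with `cleanup.policy=compact`:

```python
# Log compaction keeps only the most recent record per key.
# A record with a None value is a tombstone, marking the key for deletion.
def compact(log):
    latest = {}
    for key, value in log:
        latest[key] = value               # later records overwrite earlier ones
    # Drop tombstoned keys entirely, as a later compaction pass would.
    return [(k, v) for k, v in latest.items() if v is not None]

log = [("user1", "a@x"), ("user2", "b@x"), ("user1", "a@y"), ("user2", None)]
compacted = compact(log)
# Only the latest value for "user1" survives; "user2" was tombstoned away.
```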

  • Access Kafka's external listeners in the cloud. On Kubernetes or Red Hat OpenShift, connect via node ports, load balancers, or, externally, via an ingress or OpenShift route.
  • Understand how to configure the security of the communications between the Kafka client and the cluster.
  • Produce and consume messages, and implement event-driven and data-streaming applications.
  • Understand and provide the Kafka client configuration for the required authentication and authorization security.
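
Client-side security is configured through client properties. A hedged sketch of what a typical SASL/SSL client configuration looks like, shown as a Python dict — the hostname, credentials, and file paths are placeholders, not values from this course:

```python
# Typical client properties for an authenticated, encrypted connection.
# All values below are placeholders for illustration.
client_config = {
    "bootstrap.servers": "my-cluster-kafka-bootstrap:9093",  # placeholder host
    "security.protocol": "SASL_SSL",       # TLS encryption + SASL authentication
    "sasl.mechanism": "SCRAM-SHA-512",     # a mechanism supported by AMQ Streams
    "sasl.jaas.config": (
        'org.apache.kafka.common.security.scram.ScramLoginModule required '
        'username="my-user" password="my-password";'   # placeholder credentials
    ),
    "ssl.truststore.location": "/tmp/truststore.p12",  # cluster CA certificate
    "ssl.truststore.password": "changeit",             # placeholder password
}
```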

  • Understand and work with the different Kafka Streams APIs like Streams DSL and Processor API.
  • Configure and provide the proper Kafka SerDes (Serializer/Deserializer) for the records to correctly materialize the data.
  • Execute complex operations like mapping, filtering or joining, repartition and/or grouping, and write the results into one or more output streams.
  • Understand the stream-table duality and perform stateful operations like joins, aggregations, and windowed joins.
  • Understand how to define and connect custom processors and transformers to interact with state stores using the Processor API.
  • Understand event manipulation: deriving new collections from existing ones and describing changes between them.
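
The stream-table duality mentioned above means a table is just the latest value per key of a changelog stream, and stateful operations fold a stream into such state. A minimal sketch in plain Python (Kafka Streams does this with `KTable` and state stores; this is not the Streams API itself):

```python
# A table is the fold of a changelog stream: the latest value per key.
def to_table(changelog):
    table = {}
    for key, value in changelog:
        table[key] = value                # each update replaces the prior value
    return table

# A stateful aggregation (count per key), analogous to groupByKey().count()
# in the Streams DSL.
def count_by_key(stream):
    counts = {}
    for key, _ in stream:
        counts[key] = counts.get(key, 0) + 1
    return counts

events = [("alice", "login"), ("bob", "login"), ("alice", "logout")]
table = to_table(events)        # current state per key
counts = count_by_key(events)   # aggregated state per key
```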

  • Understand how Kafka Connect provides reliable and scalable data transfer between Kafka and other heterogeneous data systems.
  • Kafka Connect facilitates data conversion, transformation, and offset management.
  • Apply change data capture (CDC) with Debezium.
  • Understand the different stand-alone/distributed running modes and their use cases.
  • Use the pre-built AMQ Streams connectors.
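
To make the Connect/Debezium items concrete, here is a hedged sketch of what a Debezium MySQL source connector configuration looks like (Debezium 2.x property names, shown as a Python dict; hostnames, credentials, and table names are placeholders). In distributed mode this JSON is submitted to Kafka Connect's REST API:

```python
import json

# Hedged sketch of a Debezium MySQL source connector config.
# All hostnames, credentials, and table names are placeholders.
connector = {
    "name": "inventory-cdc",
    "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        "database.hostname": "mysql",                # placeholder host
        "database.port": "3306",
        "database.user": "debezium",                 # placeholder user
        "database.password": "dbz",                  # placeholder password
        "database.server.id": "184054",              # unique replication client id
        "topic.prefix": "inventory",                 # prefix for change-event topics
        "table.include.list": "inventory.orders",    # tables to capture
        "schema.history.internal.kafka.bootstrap.servers":
            "my-cluster-kafka-bootstrap:9092",       # placeholder bootstrap
        "schema.history.internal.kafka.topic": "schema-changes.inventory",
    },
}
payload = json.dumps(connector)  # POSTed to Connect's REST API (/connectors)
```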

  • Recognize and work in an application with Event Sourcing and CQRS patterns.
  • Apply advanced techniques such as long-running business transactions with Saga orchestration and outbox patterns to exchange data between different services.
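
The outbox pattern above can be sketched with SQLite from the Python standard library: the business write and the event write share one local transaction, and a separate relay (for example, a Debezium connector) later publishes the outbox rows to Kafka. Table and column names here are illustrative, not prescribed by the course:

```python
import sqlite3, json, uuid

# Outbox pattern sketch: the order row and its outbox event are written in
# ONE atomic transaction, so the event can never be lost or duplicated by
# a crash between two separate writes.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id TEXT PRIMARY KEY, status TEXT)")
db.execute("CREATE TABLE outbox (id TEXT PRIMARY KEY, aggregate_id TEXT, "
           "type TEXT, payload TEXT)")

def place_order(order_id):
    with db:  # one atomic transaction for both writes
        db.execute("INSERT INTO orders VALUES (?, ?)", (order_id, "CREATED"))
        db.execute("INSERT INTO outbox VALUES (?, ?, ?, ?)",
                   (str(uuid.uuid4()), order_id, "OrderCreated",
                    json.dumps({"orderId": order_id})))

place_order("o-1")
# A relay process would now read these rows and publish them to Kafka.
pending = db.execute("SELECT type, payload FROM outbox").fetchall()
```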

  • Maintaining message ordering
  • Retries and Idempotency
  • Handling duplicate events
  • Implement Streams test cases
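
Duplicate handling and idempotency, listed above, usually come down to remembering which event IDs have already been applied. A minimal sketch in plain Python — in production the seen-ID set would live in a durable store alongside the business state, not in memory:

```python
# Idempotent consumer sketch: remember processed event IDs so that a
# redelivered (duplicate) event has no effect on the state.
processed_ids = set()
balance = 0

def handle(event):
    global balance
    if event["id"] in processed_ids:      # duplicate delivery: skip it
        return False
    balance += event["amount"]
    processed_ids.add(event["id"])
    return True

events = [
    {"id": "e1", "amount": 100},
    {"id": "e2", "amount": 50},
    {"id": "e1", "amount": 100},          # retried/duplicate delivery of e1
]
applied = [handle(e) for e in events]
# The duplicate of e1 is ignored, so the balance reflects each event once.
```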


Fees Structure : 12500 INR / 150 USD
Total No. of Classes : 33 Video Classes
Class Duration : 47:30 Hours
Download Feature : Download Available
Technical Support : Call / Whatsapp : +91 8680961847
Working Hours : Monday to Friday 9 AM to 6 PM
Payment Mode : Credit Card / Debit Card / NetBanking / Wallet (GPay/PhonePe/Paytm/WhatsApp Pay)


Fees Structure : 18500 INR / 230 USD
Class Duration : 45 Days
Class Recording : Live Class Recording available
Class Time : Monday to Friday 1.5 hours per day / Weekend 3 hours per day
Technical Support : Call / Whatsapp : +91 8680961847
Working Hours : Monday to Friday 9 AM to 6 PM
Payment Mode : Credit Card / Debit Card / NetBanking / Wallet (GPay/PhonePe/Paytm/WhatsApp Pay)
