JavaScript Kafka Producer Example

Kafka producer plugin that allows you to convert a Structured Record into CSV or JSON. In our example, we will use MapR Event Store for Apache Kafka, a new distributed messaging system for streaming event data at scale. close() Simple consumer. In below example, assume that we're implementing a basic notification application which allow users to subscribe to receive notifications from other users. Today, we will discuss Kafka Consumer. ms=1000 # The producer groups together any records that arrive in between request transmissions into a single batched request. The Kafka tutorial has example Java Kafka producers and Kafka consumers. You create a new replicated Kafka topic called my. There are two projects included in this repository: Producer-Consumer: This contains a producer and consumer that use a Kafka topic named test. Maven dependencies required for Kafka Java producer In order to write a kafka producer in java, we need to add following maven dependency (kafka-java-client) to our pom. Open a new command prompt and move to directory C:/kafka_2. Perhaps more useful, is to have a mixture, for example, a C++ producer sending messages to a Python consumer, or vice-versa. The Producer is used for publishing messages into the Kafka stream. My objective here is to show how Spring Kafka provides an abstraction to raw Kafka Producer and Consumer API's that is easy to use and is familiar to someone with a Spring background. (for example, kafka. The application will essentially be a simple proxy application and will receive a JSON containing the key that's going to be sent to kafka topic. Both Producer and Consumer in. 0 version this project is a complete rewrite based on the new spring-kafka project which uses the pure java Producer and Consumer clients provided by Kafka 0. The kafka-console-producer is a program included with Kafka that creates messages from command line input (STDIN). The tables below may help you to find the producer best suited for your use-case. 
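The batching behavior sketched above (records that arrive between request transmissions are grouped into a single batched request, with `linger.ms` bounding the wait) can be modeled without a real client. A minimal sketch, assuming a hypothetical `BatchingBuffer` and a `send_fn` hook standing in for the network send; this is not any Kafka client's actual API:

```python
import time

class BatchingBuffer:
    """Toy model of producer batching: records accumulate until either
    batch_size records are buffered or linger_ms elapses, then the whole
    batch is handed to send_fn as one request."""

    def __init__(self, send_fn, batch_size=16, linger_ms=1000, clock=time.monotonic):
        self.send_fn = send_fn
        self.batch_size = batch_size
        self.linger_s = linger_ms / 1000.0
        self.clock = clock
        self.buffer = []
        self.first_append = None  # time the oldest buffered record arrived

    def append(self, record):
        if self.first_append is None:
            self.first_append = self.clock()
        self.buffer.append(record)
        self.maybe_flush()

    def maybe_flush(self):
        # flush when the batch is full or the linger window has expired
        expired = (self.first_append is not None
                   and (self.clock() - self.first_append) >= self.linger_s)
        if len(self.buffer) >= self.batch_size or expired:
            self.flush()

    def flush(self):
        if self.buffer:
            self.send_fn(list(self.buffer))
            self.buffer.clear()
            self.first_append = None
```

With `batch_size=3` and a long linger window, seven appended records go out as two full batches, and the leftover record waits until an explicit flush.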
This means I don't have to manage infrastructure, Azure does it for me. PARTITIONER_CLASS_CONFIG , which matches the fully qualified name of our CountryPartitioner class. Based on this configuration, you could also switch your Kafka producer from sending JSON to other serialization methods. Move updated (new temporary) table to original table. In last blog we have learned how to install Kafka in Linux. Apache Kafka : New producer API in 0. Java Kafka producer example We have covered different configurations and APIs in previous sections. The code for this example can be found in the kafka-operator repository at hack/kafka-test-pod/main. For example, a connector to a relational database might capture every change to a table. These are the top rated real world C# (CSharp) examples of KafkaNet. (Step-by-step) So if you’re a Spring Kafka beginner, you’ll love this guide. Apache Kafka with Node. Before we dive in deep into how Kafka works and get our hands messy, here's a little backstory. JS for interacting with Apache Kafka, I have described how to create a Node. In my previous post here, I set up a “fully equipped” Ubuntu virtual machine for Linux developement. Further fixes to make connection more robust on kafka up/down Multi host per broker Multi Topic for consumer. While creating a producer we need to specify Key and Value Serializers so that the API knows how to serialize those values. On this section, we will learn the internals that compose a Kafka producer, responsible for sending messages to Kafka topics. If you haven’t already, check out my previous tutorial on how to setup Kafka in docker. 1 is now available, utilizing the spring-kafka 1. Combining Reactive Streams, Heroku Kafka, and Play Framework. Apache Kafka is an open source distributed pub/sub messaging system originally released by the engineering team at LinkedIn. In this post will see how to produce and consumer User pojo object. 
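As a sketch of what a partitioner like the `CountryPartitioner` registered via `PARTITIONER_CLASS_CONFIG` might do, here is the routing logic in Python. The function name and its `pinned` mapping are assumptions for illustration, not the class from the text: selected countries get reserved partitions, everything else is hashed into the remaining ones.

```python
import hashlib

def country_partition(key, num_partitions, pinned=None):
    """Route records for 'pinned' countries to dedicated partitions and
    spread all other keys over the remaining partitions by hash."""
    pinned = pinned or {}
    if key in pinned:
        return pinned[key]
    reserved = len(pinned)
    # hash the key deterministically into the non-reserved partition range
    digest = int(hashlib.md5(key.encode("utf-8")).hexdigest(), 16)
    return reserved + digest % (num_partitions - reserved)
```

The point of the design is that records for a given key always land on the same partition, which is what preserves per-key ordering.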
I’m running my Kafka and Spark on Azure using services like Azure Databricks and HDInsight. The producer is thread safe and sharing a single producer instance across threads will generally be faster than having multiple instances. First, start Kafka …. \kafka\config\server. sh --zookeeper localhost:2181 --topic topic1 [2017-03-12 06:12:34,375] WARN Fetching topic metadata with correlation id 0 for topics [Set(eventbustopic)] from. In all our examples Kafka has been used just for data transportation with any necessary transformation happening in the target datastore like BigQuery, with the usage of languages like Python and engines like Spark Streaming or directly in the querying tool like Presto. We will be creating a kafka producer and consumer in Nodejs. You can rate examples to help us improve the quality of examples. My first reaction was not great. With the Java client, you can use batch. Romancing the Confluent Platform 2. This post walks you through the process of Streaming Data from Kafka to Postgres with Kafka Connect AVRO, Schema Registry and Python. Apache Kafka is an open source, distributed publish-subscribe messaging system, mainly designed with the following characteristics:. Cloudurable provides Kafka training, Kafka consulting, Kafka support and helps setting up Kafka clusters in AWS. We have also expanded on the Kafka design section and added references. Net Core Consumer for Kafka. To create a topic for example we looked at how to use kafka. For example, a connector to a relational database might capture every change to a table. Each record is routed and stored in a specific partition based on a partitioner. Learn how to use the Apache Kafka Producer and Consumer APIs with Kafka on HDInsight. In this post, we’ll see how to create a Kafka producer and a Kafka consumer in a Spring Boot application using a very simple method. Kafka's history. So, the data gets distributed among each partition of Topic-T under each broker, i. 
Start with Kafka," I wrote an introduction to Kafka, a big data messaging system. Kafka Template accepts producer factory and internally creates producer object whenever required. This client could be modified to be asynchronous by introducing a queue and executor pool to KafkaWriter. A purpose I could imagine would be the testing of a certain business logic that would trigger a kafka producer eventually. Here is an example of the recommended architecture for using the UDP to Kafka origin: If you need to process data before writing it to Kafka, need to write to a destination system other than Kafka, or if the origin does not need to process high volumes of data, use the UDP Source origin. Our multinational business uses it for asynchronous communication, search, social network activity stream, data pipeline, even in its trade processes. In this kafka example we need to build at least 5 images: 2 for consumer, 1 producer and one for kafka and one for zookeper. If the key is null, Kafka uses random partitioning for message. For a list of other Kafka resources, see Kafka Tutorials page. In this article, we will be using spring boot 2 feature to develop a sample Kafka subscriber and producer application. In this tutorial, we shall learn Kafka Producer with the help of Example Kafka Producer in Java. It assumes the reader is already familiar with Kafka architectural components such as Producers, Consumers, and Topics. Writing a Spring Boot Kafka Producer We'll go over the steps necessary to write a simple producer for a kafka topic by using spring boot. First, start Kafka …. This is a complete end to end example. In this article we will see how to write a kafka producer in java to write data on kafka cluster. The following table lists the data formats supported by each origin. (Step-by-step) So if you're a Spring Kafka beginner, you'll love this guide. 
Some features will only be enabled on newer brokers (0.9+). This example demonstrates how to use Apache Avro to serialize records that are produced to Apache Kafka while allowing evolution of schemas and nonsynchronous update of producer and consumer applications. Apache Kafka is the buzz word today. Spark Streaming + Kafka Integration Guide (Kafka broker version 0.10.0 documentation). Perform a fully compatible schema evolution. Confluent REST Proxy: Learn how to use the REST Proxy with a REST Client (Insomnia) in order to interface with Apache Kafka using REST. By the end of this video, you will have a sound understanding of Apache Kafka producer API. Kafka tutorial #1 - Simple Kafka producer in Kotlin. This tutorial will present an example of streaming Kafka from Spark. Again we have three mandatory configuration properties to pass: bootstrap. Run the following: bin/kafka-console-producer.sh --broker-list :9094 --topic For more details, see the Strimzi documentation. Kafka is the leading open-source, enterprise-scale data streaming technology. See spring. kafka spring-kafka 2. In our example, the consumer queries Kafka for the highest offset of each partition, and then only waits for new messages. , on windows. It helps you move your data where you need it, in real time, reducing the headaches that come with integrations between multiple source and target systems. Let's start coding one simple Java producer, which will help you create your own Kafka producer. Topics, consumers, producers etc. The input stream will be records giving the new location of a person. In this tutorial, we are going to create a simple Java example that creates a Kafka producer. Version = sarama. 
This section gives a high-level overview of how the producer works, an introduction to the configuration settings for tuning, and some examples from each client library. A Kafka client that publishes records to the Kafka cluster. create a Kafka Client and Producer using Node module kafka-node process one record at a time, and when done schedule the next cycle using setTimeOut with a random delay turn each parsed record into an object and publish the JSON stringified representation to the Kafka Topic. Java is the example here, but there are several libraries, including a good one for. sh config/zookeeper. The Connector API allows building and running reusable producers or consumers that connect Kafka topics to existing applications or data systems. An example of a producer application could be a web server that produces "page hits" that tell when a web page was accessed, from which IP address, what the page was and how long it took. sh --describe --zookeeper localhost:2181 --topic sample Creating Producer and Consumer. This tutorial will present an example of streaming Kafka from Spark. In this example we use the JVM truststore to talk to the MSK cluster. 10 is similar in design to the 0. Figure 1-8 shows an example of an architecture that uses MirrorMaker, aggregating messages from two local clusters into an aggregate cluster, and then copying that cluster to. In order to send messages with both keys and values you must set the --parse. 0 documentation) 2. Writing Text File contents to Kafka with Kafka Connect When working with Kafka you might need to write data from a local file to a Kafka topic. 2, there is a newer, better and faster version of the Producer API. In this example, because the producer produces string message, our consumer use StringDeserializer which is a built-in deserializer of Kafka client API to deserialize the binary data to the string. If we had started the producer before the consumer, the messages would have been silently ignored. 
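The kafka-node flow described above (read one record, turn it into an object, publish the JSON-stringified representation, then schedule the next cycle with a random delay) can be sketched in Python, with a stubbed `publish` callable standing in for the real producer's send:

```python
import csv
import io
import json
import random

def publish_csv_records(csv_text, publish, delay_fn=random.random):
    """Parse CSV rows one at a time, turn each row into a dict, and hand
    the JSON-stringified record to `publish` (a stand-in for a real Kafka
    producer's send). Returns the per-record delays that a scheduler
    (setTimeout in the Node.js version) would sleep between cycles."""
    delays = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        publish(json.dumps(row))          # one message per parsed record
        delays.append(delay_fn())         # random pause before the next cycle
    return delays
```

Injecting `publish` and `delay_fn` keeps the loop testable without a broker or a timer.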
Kafka Producer Partitioning Scheme. These examples are extracted from open source projects. These prices are written in a Kafka topic (prices). Kafka can maintain a large amount of data without any issue. Every time a producer publishes a message to a broker, the broker simply appends the message to the last segment file. the class that implements kafka. npm init kafka-node-countries. - [Instructor] Okay, so wouldn't it be nice if we had the option to understand where the message was produced, and understand if it was produced correctly and what's the offset value, and the partition number, et cetera, et cetera. Here is a simple example of using the producer to send records with strings containing sequential numbers as the key/value pairs. You can vote up the examples you like and your votes will be used in our system to generate more good examples. Motivation At early stages, we constructed our distributed messaging middleware based on ActiveMQ 5. But this time, we want to get a response and handle an exception, so we wrap it in a try-catch and finally blocks. 0 Add in High level producer/consumer. Let's see an example. Moreover, we will see KafkaProducer API and Producer API. Example: processing streams of events from multiple sources with Apache Kafka and Spark. At the cloned repository, we have a folder called kafka-using-java, that contains one producer example, using our. 
This comprehensive Kafka tutorial covers Kafka architecture and design. In older versions of Kafka, we basically used the code called by the kafka-topics. If you attempt to retrieve the message key from the example-topic it will be null. Below class determines the partitioning in the topic where the message needs to be sent. Spring Boot Kafka Producer Example: On the above pre-requisites session, we have started zookeeper, Kafka server and created one hello-topic and also started Kafka consumer console. Make connection more robust on kafka up/down due to bugs and problem points in in kafka-node. Net Core Central. 8 Direct Stream approach. Kafka producers send records to topics. This is a complete end to end example. Kafka allows us to create our own serializer and deserializer so that we can produce and consume different data types like Json, POJO e. NET framework. However, it is still possible for some producers and consumers to produce/consume huge volume of data and hence monopolize broker resources and cause network saturation. This section gives a high-level overview of how the producer works, an introduction to the configuration settings for tuning, and some examples from each client library. /bin/kafka-console-producer. I am working on basic pub/sub pattern using IICS on. 0, you can start using the Kafka endpoint from your existing applications with no code change but a minimal. (5 replies) Anyone have a python/avro producer that slurps up records from a flat file (i. For example, you can run the parent transformation on a timed schedule, or abort the sub-transformation if sensor data exceeds a preset range. In this tutorial, we are going to create simple Java example that creates a Kafka producer. Kafka Connect and the JSON converter is available as part of the Apache Kafka download. These credentials are also provided via a JVM config option. 
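The default routing rule described here — hash the key to a fixed partition, and fall back to arbitrary placement when the key is null — can be sketched as follows. Note the hedge: `crc32` stands in for Kafka's actual murmur2 hash, so the partition numbers will not match a real cluster.

```python
import random
import zlib

def choose_partition(key, num_partitions, rng=random):
    """Sketch of the default partitioning rule: keyed messages hash to a
    fixed partition (crc32 here, murmur2 in the real client), keyless
    messages are spread arbitrarily across partitions."""
    if key is None:
        return rng.randrange(num_partitions)
    return zlib.crc32(key.encode("utf-8")) % num_partitions
```

This is why retrieving the message key matters: without a key there is no stable mapping from message to partition, and per-key ordering guarantees are lost.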
Apache Kafka Java Example (Producer + Consumer) In this tutorial, we will be developing a sample apache kafka java application using maven. Sample scenario The sample scenario is a simple one, I have a system which produces a message and another which processes it. From here and here. This comprehensive Kafka tutorial covers Kafka architecture and design. The KafkaTemplate now provides access to the metrics and partitionsFor methods on the Producer. This means that the operations done while a message is published on Kafka Topic partition are non-blocking. The basic properties of the producer are the address of the broker and the serializer of the key and values. Basic Producer Examples You should use the ProduceAsync method if you would like to wait for the result of your produce requests before proceeding. How does Kafka work? Kafka looks and feels like a publish-subscribe system that can deliver in-order, persistent, scalable messaging. If you look at the try block, we are still calling producer. $ /usr/bin/kafka-console-producer --broker-list kafka02. Well! There has to be a Producer of records for the Consumer to feed on. The examples in this repository demonstrate how to use the Kafka Consumer, Producer, and Streaming APIs with a Kafka on HDInsight cluster. , Broker 1, Broker2, and Broker 3. sh --broker-list localhost:9092 --topic test >Hello >World You start the console based producer interface which runs on the port 9092 by default. Apache Kafka uses another Apache project internally, namely Apache Zookeeper. 2 Add in High level producer. As and when I'm ready to deploy the code to a 'real' execution environment (for example EMR), then I can start to worry about that. JS application that publishes messages to a Kafka Topic (based on entries in a CSV file), how to create a simple Kafka Streams Java application that processes such messages from that TopicRead More. 
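The basic producer properties named above — the address of the broker plus key and value serializers — might look like this with kafka-python-style keyword names. Treat the exact argument names as an assumption to double-check against the client you actually use:

```python
import json

def producer_config(brokers):
    """The three things every producer needs: where the cluster is, and
    how to turn keys and values into bytes on the wire. Keyword names
    follow kafka-python's KafkaProducer convention."""
    return {
        "bootstrap_servers": brokers,
        "key_serializer": lambda k: k.encode("utf-8"),
        "value_serializer": lambda v: json.dumps(v).encode("utf-8"),
    }
```

The serializers are plain callables, which makes them easy to test in isolation before any broker is involved.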
(Table of data formats supported by each origin: Avro, Binary, Datagram, Delimited, Excel, JSON, Log, Protobuf, SDC Record, Text, Whole File, and XML, for origins such as Amazon S3 and Amazon SQS Consumer.) It is multi-producer and multi-consumer. How does Kafka work? Kafka looks and feels like a publish-subscribe system that can deliver in-order, persistent, scalable messaging. When you register a Data Collector, Data Collector generates an authentication token that it uses to issue authenticated requests to Control Hub. Kafka Tutorial. This post is the continuation of the previous post ASP. The Spring Integration Kafka extension project provides inbound and outbound channel adapters for Apache Kafka. bin/kafka-topics. In this blog, we'll walk through an example of using Kafka Connect to consume writes to PostgreSQL, and automatically send them to Redshift. C# (CSharp) KafkaNet Producer - 30 examples found. , mix of string and binary data) and publishes to Kafka they would be willing to share? Starting to think this might be a whole lot faster to do in Java, but maybe someone has a Python solution already in hand. Spring Kafka Consumer Producer Example In this post, you're going to learn how to create a Spring Kafka Hello World example that uses Spring Boot and Maven. import kafka. In such cases, one major pain point can be in the coordination of the agreed upon message format between producers and consumers. In this article, let us explore setting up a test Kafka broker on a Windows machine, create a Kafka producer, and create a Kafka consumer using the. sh utility, which is part of Apache Kafka to connect the cluster: bin/kafka-console-producer. The Kafka tutorial also covers Avro and Schema Registry. Net Core Streaming Application Using Kafka - Part 1. 
Let's take a deeper look at what Kafka is and how it is able to handle these use cases. In this blog, we will show how Structured Streaming can be leveraged to consume and transform complex data streams from Apache Kafka. bat --broker-list localhost:9092 --topic javainuse-topic Hello World Javainuse Finally Open a new command prompt and start the consumer which listens to the topic javainuse-topic we just created above. Then a consumer will read the data from the broker and store them in a MongoDb collection. In Kafka, we can create as many topics as we want. Kafka Producer Java API You can write producers in quite a few languages. Finally, bar is sent to the cluster by the OSN 0. Every line is a new message. Kafka Producer: Sending a message. Apache Kafka is publish-subscribe messaging rethought as a distributed commit log. MapR Event Store integrates with Spark Streaming via the Kafka direct approach. Kafka sample producer that sends JSON messages. Kafka producer client consists of the following APIs. Producer extracted from open source projects. Kafka Zookeeper Installation $ brew install kafka Command will automatically install Zookeeper as a dependency. Example 1: Who's in the same location at the same time? So you can imagine what the rooms look like, here's the Cluedo mansion in The Sims. 
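A producer that sends JSON messages needs a matching serializer on the producer side and deserializer on the consumer side. A minimal stdlib-only sketch of such a pair (the function names are illustrative, not a client API):

```python
import json

def serialize(obj):
    """Producer-side JSON value serializer: object -> bytes on the wire."""
    return json.dumps(obj).encode("utf-8")

def deserialize(data):
    """Matching consumer-side deserializer: bytes -> object."""
    return json.loads(data.decode("utf-8"))
```

The round trip must be lossless: whatever the producer serializes, the consumer's deserializer should reconstruct exactly.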
This tutorial demonstrates how to configure a Spring Kafka Consumer and Producer example. , mix of string and binary data) and publishes to Kafka they would be willing to share? Starting to think this might be a whole lot faster to do in Java, but maybe someone has a Python solution already in hand. We will take a look at the use of KafkaTemplate to send messages to Kafka topics, @KafkaListener annotation to listen to those. 9 & InfluxDB: A Simple Producer and Consumer Example Thou Shall Publish…Thy Shall Subscribe… For as long as there have been printing papers, there have been publishers and consumers. Spring Boot Apache Kafka. Moreover, we will see KafkaProducer API and Producer API. Start with Kafka," I wrote an introduction to Kafka, a big data messaging system. Our multinational business uses it for asynchronous communication, search, social network activity stream, data pipeline, even in its trade processes. (for example, kafka. Pipeline Statistics A Control Hub job defines the pipeline to run and the Data Collectors or Edge Data Collectors (SDC Edge) that run the pipeline. To test this example, you will need a Kafka broker running release 0. In this tutorial, we are going to create simple Java example that creates a Kafka producer. There are many configuration options for the consumer class. Apache Kafka is a pub-sub solution; where producer publishes data to a topic and a consumer subscribes to that topic to receive the data. This tutorial demonstrates how to configure a Spring Kafka Consumer and Producer example. In this blog, I will thoroughly explain how to build an end-to-end real-time data pipeline by building four micro-services on top of Apache Kafka. By default each line will be sent as a separate message. closeProducer KAFKA producer shutdown Description This function closes the KAFKA producer Usage rkafka. This post gives an overview of Apache Kafka and using an example use-case, shows how to get up and running with it quickly and easily. 
The following arguments are currently recognized: We create a Message Consumer which is able to listen to messages sent to a Kafka topic. Setup your free Apache Kafka instance here: https://www. In this Kafka Producer tutorial, let's examine a Kafka Producer example and highlight some of the key features and customization options. A simple Kafka Consumer and Producer example. Then we can do so, using the below steps. In this post I will create a Producer in ASP. 1/z(For my case). Simplified embedded kafka configuration when using Spring Boot. Once installed, interacting with Kafka is relatively simple. Migrating Your Apache Kafka Cluster to Amazon MSK. Download the Kafka 0. A producer is an application that generates data but only to provide it to some other application. The producer creates the objects, converts (serializes) them to JSON and publishes them by sending and enqueuing to Kafka. level config to read_committed to make sure that any consumed messages are from committed transactions. Note that you should first create a topic named demo-topic from the Aiven web console. $ /usr/bin/kafka-console-consumer --zookeeper zk01. The below code snippet is from a client and it is written in Java. Partitions. 
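The synchronous send-with-error-handling flow the tutorial describes, wrapping the call in try/except and releasing the producer in finally, can be sketched like this. Here `producer_send` and `close` are hypothetical stand-ins for a real client's methods, injected so the flow can be exercised without a broker:

```python
def send_sync(producer_send, close, topic, value):
    """Synchronous send sketch: wait for the result, surface any error,
    and always release the producer in `finally`."""
    try:
        metadata = producer_send(topic, value)  # blocks until acked in this sketch
        return ("ok", metadata)
    except Exception as exc:
        return ("error", str(exc))
    finally:
        close()  # runs on success and on failure alike
```

The `finally` block is the important part: the producer is closed whether the send succeeded or raised.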
Kafka’s producer explained. Here is an example of the recommended architecture for using the UDP to Kafka origin: If you need to process data before writing it to Kafka, need to write to a destination system other than Kafka, or if the origin does not need to process high volumes of data, use the UDP Source origin. In this course, you’ll learn how to use Kafka to publish and subscribe to data streams, and how Kafka can be used to solve various use cases. x(prior to 5. Stop zabbix server. This article explains how to write Kafka messages to Kafka topic (producer) and read messages from topic (consumer) using Scala example; producer sends messages to Kafka topics in the form of records, a record is a key-value pair along with topic name and consumer receives a messages from a topic. What I am attempting to do is start up a Kafka producer on a remote server at the client site so that RTA states are delivered through the Kafka producer. We hope you enjoyed this article. This is a code example that how to use “kafka-python” package to write Kafka producer/consumer. (Step-by-step) So if you’re a Spring Kafka beginner, you’ll love this guide. Consume extracted from open source projects. This is a code example that how to use "kafka-python" package to write Kafka producer/consumer. By default each line will be sent as a separate message. Messages are consumed from one Kafka cluster and produced for another. We feel that Kafka can be very useful in many places outside of LinkedIn. You enter producer mode with the -P option. Spring Boot Kafka Producer Example: On the above pre-requisites session, we have started zookeeper, Kafka server and created one hello-topic and also started Kafka consumer console. Using Confluent's client builders. I will try to put some basic understanding about Apache Kafka and then we will go through a running example. Kafka Producer Java API You can write producers in quite a few languages. 
Apache Kafka - Simple Producer Example - Let us create an application for publishing and consuming messages using a Java client. This is an example of a synchronous client. cloudkarafka. They are deserializers used by the Kafka consumer to deserialize the binary data received from the Kafka cluster to our desired data types. Creates new producer client object. In our example we'll create a producer that emits numbers from 1 to 1000 and sends them to our Kafka broker. # This is one file from my current assignment where I am making a Kafka producer # that will allow the other parts of the application to send messages to Kafka. CreateTopicCommand. On the Kafka Producer side of things, check out kafka-console-producer examples. According to Kafka documentation: Kafka comes with a command line client that will take input from a file or I could not seem to find any documentation on how the command line client can read from a file. In this post, we will be discussing how to stream Twitter data using Kafka. 0 with Apache Kafka 0. You can set the topic dynamically by using a format string to access any event field. send method. The producer and consumer components in this case are your own implementations of kafka-console-producer. 
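Emitting the numbers 1 to 1000 reduces to a loop over a send callable; in this sketch `send` is a stand-in for the real producer's send/produce call, so the loop can be verified without a broker:

```python
def produce_numbers(send, start=1, stop=1000):
    """Emit the integers start..stop as UTF-8 encoded messages through
    `send` (a stand-in for a real producer's send), returning the count."""
    for n in range(start, stop + 1):
        send(str(n).encode("utf-8"))
    return stop - start + 1
```

Encoding each number to bytes mirrors what a value serializer would do before the message hits the wire.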
To test this example, you will need a Kafka broker running release 0. For example, below configuration parameters can be used to set producer/consumer quota to 50 MB/sec - # Sets producer quota to 50 MB quota. Then, we’ll discuss a bash script that starts up a local Kafka cluster using Docker Compose, sends a set of test messages through the producer, and finally kills the consumer and resurrects it again in order to simulate a recovery. So I have also decided to dive in it and understand it. Kafka Producer API helps to pack the message and deliver it to Kafka Server. Stop zabbix server. Data is published to Kafka topics where it will become available for consumption by any number of consumers subscribing to the topic. close() Simple consumer. Then, go to the bin folder of the Apache Kafka installation, and run the following command. End-to-End Kafka Streams Application : Write the code for the WordCount, bring in the dependencies, build and package your application, and learn how to scale it. By default, Kafka keeps data stored on disk until it runs out of space, but the user can also set a retention limit. ms to a non-default value and wish send operations on this template to occur immediately, regardless of that setting, or if you wish to block until the broker has acknowledged receipt according to the producer's acks property. In this tutorial, we shall learn Kafka Producer with the help of Example Kafka Producer in Java. For example, alice could use a copy of the console clients for herself, in which her JAAS file is fed to the client command. Create an instance using the supplied producer factory and autoFlush setting. The examples in this repository demonstrate how to use the Kafka Consumer, Producer, and Streaming APIs with a Kafka on HDInsight cluster. Kafka allows us to create our own serializer and deserializer so that we can produce and consume different data types like Json, POJO e. 
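The 50 MB/sec quota mentioned above is expressed in bytes per second. On older brokers that support static default quotas, the server.properties entries might look like the sketch below; verify the property names against your broker version, since newer releases replace these with dynamic per-client quotas managed through kafka-configs.sh.

```properties
# Static broker-side default quotas (legacy; 50 MB/sec = 52428800 bytes/sec)
quota.producer.default=52428800
quota.consumer.default=52428800
```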
We will be configuring Apache Kafka and Zookeeper on our local machine and create a test topic with multiple partitions in a Kafka broker. A few weeks ago we published a story on how we developed a Kafka "backbone" in order to get USGS flow information. This blog post shows how to configure Spring Kafka and Spring Boot to send messages using JSON and receive them in multiple formats: JSON, plain Strings or byte arrays. Also note that, if you are changing the Topic name, make sure you use the same topic name for the Kafka Producer Example and Kafka Consumer Example Java Applications. For detailed information, check this repository on github. A producer publishes data to the topics, and a consumer reads that data from the topic by subscribing to it. Let's get started.