Building a Producer and Consumer using Apache Kafka and Node.js

Apache Kafka is a distributed streaming platform that allows you to publish and subscribe to streams of records. In this tutorial, we will learn how to create a producer and consumer in Node.js using the kafka-node library.

Step 1: Setting up Apache Kafka
Before you start creating your Node.js producer and consumer, you need to set up Apache Kafka on your local machine or on a remote server. You can follow the official Apache Kafka documentation to set up Apache Kafka on your machine.
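If you are running Kafka from a local download, setup typically boils down to a few commands. The paths below assume the standard Kafka distribution layout and a ZooKeeper-based configuration; adjust them for your installation:

```shell
# Start ZooKeeper, then the Kafka broker (in separate terminals):
bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties

# Create the topic used throughout this tutorial:
bin/kafka-topics.sh --create --topic my-topic \
  --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1
```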

Step 2: Install kafka-node library
You will need to install the kafka-node library in your Node.js project to interact with Apache Kafka. You can install the kafka-node library using npm with the following command:

npm install kafka-node

Step 3: Creating a Kafka Producer
Now, let’s create a Kafka producer in Node.js. In a new Node.js file, import the kafka-node library and create a new Kafka producer.

const kafka = require('kafka-node');
const Producer = kafka.Producer;

// KafkaClient connects to localhost:9092 by default; pass
// { kafkaHost: 'host:port' } to target another broker.
const client = new kafka.KafkaClient({ kafkaHost: 'localhost:9092' });
const producer = new Producer(client);

Next, we wait for the producer's ready event, which fires once it has connected to the broker, and then use the send method to publish messages to a topic.

producer.on('ready', () => {
  const payloads = [
    { topic: 'my-topic', messages: 'Hello, Kafka!' }
  ];

  producer.send(payloads, (err, data) => {
    if (err) {
      console.error(err);
    } else {
      // data maps topic -> partition -> offset, e.g. { 'my-topic': { '0': 0 } }
      console.log(data);
    }
  });
});

// Surface connection errors instead of failing silently
producer.on('error', (err) => console.error(err));
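The send method accepts plain strings, but in practice you usually serialize structured data to JSON before sending. A minimal sketch of that pattern (the helper name toPayload is ours, not part of kafka-node):

```javascript
// JSON-serialize a JavaScript object into the payload shape
// that producer.send() expects.
function toPayload(topic, obj) {
  return { topic, messages: JSON.stringify(obj) };
}

const payloads = [toPayload('my-topic', { greeting: 'Hello, Kafka!', ts: 0 })];
console.log(payloads[0].messages); // '{"greeting":"Hello, Kafka!","ts":0}'

// With a connected producer (as in the step above) you would then call:
// producer.send(payloads, callback);
```

kafka-node also exposes a kafka.KeyedMessage class when you need per-message keys.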

Step 4: Creating a Kafka Consumer
Now, let’s create a Kafka consumer in Node.js. In a new Node.js file, import the kafka-node library and create a new Kafka consumer.

const kafka = require('kafka-node');
const Consumer = kafka.Consumer;
const client = new kafka.KafkaClient({ kafkaHost: 'localhost:9092' });

// Read partition 0 of my-topic, committing offsets automatically
const consumer = new Consumer(
  client,
  [{ topic: 'my-topic', partition: 0 }],
  { autoCommit: true }
);

Next, we need to listen for messages from the Kafka topic. The consumer emits a message event for every record it receives, so we attach a listener with consumer.on('message', ...).

consumer.on('message', (message) => {
  console.log(message); // includes topic, value, offset, partition
});

// Surface connection errors instead of failing silently
consumer.on('error', (err) => console.error(err));
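By default kafka-node delivers message.value as a string, so JSON payloads need to be parsed before use. A small sketch (the helper name parseValue is ours):

```javascript
// Parse a consumed message's value as JSON, falling back to the
// raw string when the value is not valid JSON.
function parseValue(message) {
  try {
    return JSON.parse(message.value);
  } catch (e) {
    return message.value;
  }
}

console.log(parseValue({ value: '{"greeting":"Hello"}' })); // { greeting: 'Hello' }
console.log(parseValue({ value: 'Hello, Kafka!' }));        // 'Hello, Kafka!'
```

Inside the message listener above, you would call parseValue(message) instead of logging the raw message object.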

Step 5: Running the Producer and Consumer
Now that we have created our Kafka producer and consumer, we can run the producer and consumer in separate Node.js processes to send and receive messages.

To run the producer, save the producer code in a file (e.g., producer.js) and run the file using Node.js:

node producer.js

To run the consumer, save the consumer code in a file (e.g., consumer.js) and run the file using Node.js:

node consumer.js

You should see a message object whose value is "Hello, Kafka!" logged to the console by the consumer, indicating that the consumer successfully received the message sent by the producer.
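As an independent check, you can also tail the topic with the console consumer that ships with Kafka (the path assumes the standard distribution layout):

```shell
bin/kafka-console-consumer.sh --topic my-topic --from-beginning \
  --bootstrap-server localhost:9092
```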

In this tutorial, we have learned how to create a Kafka producer and consumer in Node.js using the kafka-node library. You can now start building real-time data streaming applications using Apache Kafka and Node.js.

14 Comments
@lemoncode21
3 hours ago

Table of contents

00:00:19 Setup main nodejs express
00:02:07 Create config kafka
00:05:27 Create Controller
00:07:47 Setup docker kafka
00:09:07 Test Pub sub

@mohamedsalahsfar8717
3 hours ago

Would you please provide me with the docker compose file configuration?

@PrashantShekher
3 hours ago

Please provide details for the idempotent case too, with a running snippet.

@valdineidossantos103
3 hours ago

How can I consume only the 10 latest messages?

@AjeetMaurya-b6g
3 hours ago

The docker compose file does not work for me. I copied and pasted exactly the same file as attached in one of the replies.

@manhamvan5909
3 hours ago

I'm using this docker-compose.yml, but it's not working:

version: '3.3'
services:
  zookeeper:
    image: 'bitnami/zookeeper:3.7.0'
    container_name: zookeeper
    ports:
      - '2181:2181'
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes
    volumes:
      - ./bitnami/zookeeper:/bitnami/zookeeper

  kafka:
    image: 'bitnami/kafka:2.8.0'
    container_name: kafka
    ports:
      - '9093:9093'
    expose:
      - '9093'
    environment:
      - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
      - KAFKA_CREATE_TOPICS="kafka_capstone_event_bus:1:1"
      - KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE=true
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CLIENT:PLAINTEXT,EXTERNAL:PLAINTEXT
      - KAFKA_CFG_LISTENERS=CLIENT://:9092,EXTERNAL://:9093
      - KAFKA_CFG_ADVERTISED_LISTENERS=CLIENT://kafka:9092,EXTERNAL://localhost:9093
      - KAFKA_INTER_BROKER_LISTENER_NAME=CLIENT
      - ALLOW_PLAINTEXT_LISTENER=yes
    depends_on:
      - zookeeper
    volumes:
      - ./bitnami/kafka:/bitnami/kafka

  kafdrop:
    image: obsidiandynamics/kafdrop
    container_name: kafdrop
    ports:
      - "9000:9000"
    environment:
      KAFKA_BROKERCONNECT: "kafka:9092"
      JVM_OPTS: "-Xms16M -Xmx48M -Xss180K -XX:-TieredCompilation -XX:+UseStringDeduplication -noverify"
    depends_on:
      - kafka

@abelwondwosen1078
3 hours ago

Hey, can you please share the docker-compose file?

@new-wt3zg
3 hours ago

Thank you very much 😊

@20240Z
3 hours ago

Hello, thanks for your mentoring. Do I need to change the docker compose file to a configuration without ZooKeeper? If so, how do I change it?

@nathan_falkon36
3 hours ago

Hi! Thank you for the video. Can you share the code for the docker compose as well?

@yarinh8417
3 hours ago

Does anybody know why I get rebalance warnings when running the consumer? Also, in my code I disconnect the consumer each time it finishes consuming.

@d-landjs
3 hours ago

Excellent master!

@fauzi5848
3 hours ago

Bro, please explain the tools related to using Kafka, like ZooKeeper and Kafdrop in that video: what each of them is for and why we need them. Also, are there other tools commonly used when developing applications that use Kafka?
Thank you!

@wellerson-full
3 hours ago

GitHub?
