Lab - Confluent on the VM
Introduction
This is a simple lab demonstrating pub/sub with Kafka. It does not use Docker. Students will download and install Confluent, start ZooKeeper, the Kafka broker, and the Schema Registry. They will then start a producer process and a consumer process.
The install documentation is here: https://docs.confluent.io/3.3.0/installation/installing_cp.html#debian-and-ubuntu
The steps are as follows:
Start the VM with vagrant up. In the VM, add the Confluent package signing key ($ wget -qO - ... | sudo apt-key add -) and install the platform as described in the install docs linked above; the commands are sketched below.
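A minimal sketch of the Debian/Ubuntu install from the linked 3.3.0 docs. The key URL, repository path, and package name are assumptions here, so check them against the docs for your version:

# Add the Confluent package signing key
$ wget -qO - https://packages.confluent.io/deb/3.3/archive.key | sudo apt-key add -
# Add the Confluent apt repository
$ sudo add-apt-repository "deb [arch=amd64] https://packages.confluent.io/deb/3.3 stable main"
# Install the open source Confluent Platform
$ sudo apt-get update && sudo apt-get install confluent-platform-oss-2.11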
Start Confluent ZooKeeper, Kafka, and Schema Registry services using the Command Line Interface.
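A minimal sketch, assuming the confluent CLI that ships with Confluent 3.3 is on your PATH; asking it to start the Schema Registry also starts the ZooKeeper and Kafka services it depends on:

# Starts zookeeper, kafka, and schema-registry in dependency order
$ confluent start schema-registry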
You should see each service start up in turn.
Start the Kafka Avro console producer utility. It is directed at your local Kafka cluster and is configured to write to the topic test, read each line of input as an Avro message, validate the schema against the Schema Registry at the specified URL, and finally indicate the format of the data.
Open a terminal and start the producer:
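A minimal sketch of the producer invocation, assuming the quickstart defaults (broker on localhost:9092, Schema Registry at http://localhost:8081) and a simple one-field Avro schema; adjust these to your setup:

# Write Avro messages to topic "test", validating against the Schema Registry
$ kafka-avro-console-producer --broker-list localhost:9092 --topic test \
    --property schema.registry.url=http://localhost:8081 \
    --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'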
After the producer starts, the process waits for you to enter messages, so your terminal may appear idle. Enter a single message per line and press the Enter key to send it immediately. Try entering a couple of messages:
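For example, assuming the one-field schema used in the producer sketch above, each line is a JSON-encoded Avro record:

{"f1": "value1"}
{"f1": "value2"}
{"f1": "value3"}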
Now we can check that the data was produced by using Kafka's console consumer process to read data from the topic. We point it at the same test topic and our ZooKeeper instance, tell it to decode each message as Avro using the same Schema Registry URL to look up schemas, and finally tell it to start from the beginning of the topic (by default the consumer only reads messages published after it starts).
Open another terminal:
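A minimal sketch of the consumer invocation, assuming ZooKeeper on localhost:2181 and the same Schema Registry URL as above (the --zookeeper flag matches this 3.3-era consumer; newer releases use --bootstrap-server instead):

# Read the topic from the beginning, decoding each message as Avro
$ kafka-avro-console-consumer --topic test --zookeeper localhost:2181 \
    --property schema.registry.url=http://localhost:8081 \
    --from-beginning

You should see the messages you entered in the producer terminal printed here.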