kafka-with-go

Published: Dec 2, 2019 License: Apache-2.0 Imports: 9 Imported by: 0

README


Overview

This is a small playground example for using Apache Kafka with Go. Initially, I used Shopify's Sarama library but switched to Confluent's kafka library because it was easier to use and offered more examples. Note that I have not formed a final opinion yet.

Currently, you can create and list topics, inject a configurable number of random messages of a pre-defined length, and consume them (see below).

Build

The underlying library requires librdkafka, so it has to be installed first, e.g. via Homebrew:

brew install librdkafka

(For Linux, see the Dockerfile.) Afterwards, a simple

go build

builds the binary.

Usage

To create a topic, specify the topic name and the number of partitions; note that the replication factor is always 1:

kafka-with-go -create -topic demo -partitions 4

To list all topics, call

kafka-with-go -list

which shows all (including internal) topics and their number of partitions.

To inject messages into Kafka, define the topic, the number of messages, and their length:

kafka-with-go -produce -number 10000 -length 10 -topic demo

and to consume messages, define the topic and the group ID:

kafka-with-go -consume -topic demo -group 1                

There are additional optional flags you can specify, e.g.

  • -tick to define how often status messages are printed
  • -broker to set the broker (default is localhost:9092)
  • -number in the consumer to stop receiving after the given number of messages
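
A CLI like this maps naturally onto Go's standard flag package. The following is a minimal sketch with flag names taken from the usage examples above; the defaults (other than the documented localhost:9092 broker) are assumptions, not necessarily what the binary uses:

```go
package main

import (
	"flag"
	"fmt"
)

// config mirrors the flags shown in the usage examples above.
type config struct {
	create, list, produce, consume bool
	topic, group, broker           string
	partitions, number, length     int
	tick                           int
}

// parseFlags parses the given arguments into a config. Defaults other than
// -broker are assumptions for this sketch.
func parseFlags(args []string) (*config, error) {
	fs := flag.NewFlagSet("kafka-with-go", flag.ContinueOnError)
	c := &config{}
	fs.BoolVar(&c.create, "create", false, "create a topic")
	fs.BoolVar(&c.list, "list", false, "list all topics")
	fs.BoolVar(&c.produce, "produce", false, "produce random messages")
	fs.BoolVar(&c.consume, "consume", false, "consume messages")
	fs.StringVar(&c.topic, "topic", "", "topic name")
	fs.StringVar(&c.group, "group", "", "consumer group ID")
	fs.StringVar(&c.broker, "broker", "localhost:9092", "broker address")
	fs.IntVar(&c.partitions, "partitions", 1, "number of partitions")
	fs.IntVar(&c.number, "number", 0, "number of messages")
	fs.IntVar(&c.length, "length", 10, "message length in bytes")
	fs.IntVar(&c.tick, "tick", 100, "status message interval")
	if err := fs.Parse(args); err != nil {
		return nil, err
	}
	return c, nil
}

func main() {
	c, err := parseFlags([]string{"-produce", "-topic", "demo", "-number", "10000", "-length", "10"})
	if err != nil {
		panic(err)
	}
	fmt.Println(c.produce, c.topic, c.number)
}
```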

Docker

The (multi-stage) Dockerfile builds a container without external dependencies, which can then be used in arbitrary environments, e.g. with Docker Compose or Kubernetes; a corresponding docker-compose.yml is provided. Use it as follows:

# Once at the beginning
docker-compose build

# Every time...
docker-compose up
docker-compose exec go ash

and use the aforementioned kafka-with-go commands in the provided shell.

Full example

# Create topic
$ kafka-with-go -create -topic github -partitions 4
2019/12/01 13:37:16 Starting topic creation
2019/12/01 13:37:16 Creating topic github with 4 partitions
2019/12/01 13:37:16 Finished topic creation

# Add 1k messages of 128 bytes each; rather slow on my old machine ¯\_(ツ)_/¯ ... 
$ kafka-with-go -produce -topic github -number 1000 -length 128 -tick 100
2019/12/01 13:38:05 Starting producer
2019/12/01 13:38:05 Sending message 100/1000
2019/12/01 13:38:06 Sending message 200/1000
2019/12/01 13:38:06 Sending message 300/1000
2019/12/01 13:38:07 Sending message 400/1000
2019/12/01 13:38:07 Sending message 500/1000
2019/12/01 13:38:07 Sending message 600/1000
2019/12/01 13:38:08 Sending message 700/1000
2019/12/01 13:38:08 Sending message 800/1000
2019/12/01 13:38:09 Sending message 900/1000
2019/12/01 13:38:09 Sending message 1000/1000
2019/12/01 13:38:09 Finished producer

# Consume 1k messages
$ kafka-with-go -consume -topic github -number 1000 -tick 100 -group 1
2019/12/01 13:39:06 Starting consumer
2019/12/01 13:39:10 Received 100 messages (100 new)
2019/12/01 13:39:10 Received 200 messages (100 new)
2019/12/01 13:39:10 Received 300 messages (100 new)
2019/12/01 13:39:10 Received 400 messages (100 new)
2019/12/01 13:39:10 Received 500 messages (100 new)
2019/12/01 13:39:10 Received 600 messages (100 new)
2019/12/01 13:39:10 Received 700 messages (100 new)
2019/12/01 13:39:10 Received 800 messages (100 new)
2019/12/01 13:39:10 Received 900 messages (100 new)
2019/12/01 13:39:10 Received 1000 messages (100 new)
2019/12/01 13:39:10 Consumer read 1000 entries
2019/12/01 13:39:10 Finished consumer   
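
The "Received X messages (Y new)" progress lines above can be produced by a simple tick counter. Here is a minimal sketch of that idea, under the assumption that a status line is emitted every -tick messages; the repository's actual logic may differ:

```go
package main

import "fmt"

// tickReporter emits a status line every `tick` messages, reporting the
// running total and how many messages arrived since the last report.
type tickReporter struct {
	tick, total, lastReported int
}

// observe counts one message and returns a status line when a tick boundary
// is crossed, or "" otherwise.
func (r *tickReporter) observe() string {
	r.total++
	if r.total%r.tick != 0 {
		return ""
	}
	newMsgs := r.total - r.lastReported
	r.lastReported = r.total
	return fmt.Sprintf("Received %d messages (%d new)", r.total, newMsgs)
}

func main() {
	r := &tickReporter{tick: 100}
	for i := 0; i < 300; i++ {
		if line := r.observe(); line != "" {
			fmt.Println(line)
		}
	}
}
```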

License

As always, the source code is licensed under the Apache License 2.0.

Documentation

Overview

Consume submitted messages and list available topics (since both are consume operations, we define them in the same file). Note that the messages are discarded for now.

Handle command line parsing.

Playground examples for using Kafka with Go. The general workflow is as follows: (1) create a topic, (2) check success by listing it, (3) produce some data, and (4) consume the data.

Produce new messages by submitting them to the topic. Note that currently, random messages of a fixed length are sent.

Create a new topic.
