Apache Kafka Web UI for exploring messages, consumers, configurations and more with a focus on a good UI & UX.

Kowl - Apache Kafka Web UI

Kowl (previously known as Kafka Owl) is a web application that helps you explore the messages in your Apache Kafka cluster and get better insights into what is actually happening in it, in the most comfortable way:

(preview .gif)

Features

  • Message viewer: Explore your topics' messages in our message viewer through ad-hoc queries and dynamic filters. Find any message you want using JavaScript functions to filter messages (see the sketch after this list). Supported encodings are: JSON, Avro, Protobuf, XML, Text and Binary (hex view). The used encoding (except Protobuf) is recognized automatically.
  • Consumer groups: List all your active consumer groups along with their group offsets. You can view a visualization of group lag either by topic (sum of all partition lags), by single partitions, or as the sum of all partition lags (total group lag).
  • Topic overview: Browse through the list of your Kafka topics, check their configuration, space usage, list all consumers who consume a single topic or watch partition details (such as low and high water marks, message count, ...), embed topic documentation from a git repository and more.
  • Cluster overview: List available brokers, their space usage, rack ID and other information to get a high-level overview of the brokers in your cluster.
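
For illustration, a message filter is a small JavaScript snippet that returns true for every message you want to keep. A minimal sketch (the exact variables in scope are described in the documentation; `value` and the field names below are assumptions made for this example):

    // Keep only failed orders that were retried more than three times.
    // `value` is assumed to hold the decoded message value (e.g. parsed JSON).
    return value.status == "error" && value.retryCount > 3;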

A roadmap is maintained using milestones.

|                                     | Kowl                       | Kowl Business         |
| ----------------------------------- | -------------------------- | --------------------- |
| Topic Overview                      | ✅                         | ✅                    |
| Consumer Group Overview             | ✅                         | ✅                    |
| Broker/Cluster Overview             | ✅                         | ✅                    |
| Message viewer                      | ✅                         | ✅                    |
| Login System (Google, GitHub, Okta) | ❌                         | ✅                    |
| RBAC permissions with group syncing | ❌                         | ✅                    |
| Screenshots                         | Preview .gif in README     | https://cloudhut.dev/ |
| Price                               | Always free / Open source  | Free during beta*     |

*If you want to participate in the free beta, sign up here: https://license.cloudhut.dev/ . You'll get a forever-free license for 2 seats. If you need more than 2 seats, just drop us an email at [email protected]

Getting Started

Prerequisites

  • Kafka Cluster (v1.0.0+) connectivity
  • At least one OAuth app for SSO (Kowl business only)
  • Internet connectivity to validate license (Kowl business only)

Installing

We offer pre-built Docker images for Kowl (Business), a Helm chart and a Terraform module to make the installation as comfortable as possible for you. Please take a look at our dedicated installation documentation.

Quick Start

Do you just want to test Kowl against one of your Kafka clusters without spending too much time on the test setup? Here are some docker commands that allow you to run it locally against an existing Kafka cluster:

Kafka is running locally

Since Kowl runs in its own container (which has its own network scope), we have to use host.docker.internal as the bootstrap server. That DNS name resolves to the host system's IP address. However, since Kafka brokers return the list of all brokers' advertised addresses once a client has connected, you have to make sure the advertised listener is configured accordingly, e.g.: PLAINTEXT://host.docker.internal:9092
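
On the broker side this usually means a line like the following (a sketch; many Kafka Docker images accept the same value via the KAFKA_ADVERTISED_LISTENERS environment variable):

    # server.properties on the broker
    advertised.listeners=PLAINTEXT://host.docker.internal:9092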

docker run -p 8080:8080 -e KAFKA_BROKERS=host.docker.internal:9092 quay.io/cloudhut/kowl:master

Docker supports the --network=host option only on Linux. Linux users can therefore use localhost:9092 as the advertised listener and run the container in the host's network namespace instead; Kowl will then behave as if it were executed directly on the host machine.

docker run --network=host -p 8080:8080 -e KAFKA_BROKERS=localhost:9092 quay.io/cloudhut/kowl:master

Kafka is running remotely

Protected via SASL_SSL and trusted certificates (e.g. Confluent Cloud):

docker run -p 8080:8080 -e KAFKA_BROKERS=pkc-4r000.europe-west1.gcp.confluent.cloud:9092 -e KAFKA_TLS_ENABLED=true -e KAFKA_SASL_ENABLED=true -e KAFKA_SASL_USERNAME=xxx -e KAFKA_SASL_PASSWORD=xxx quay.io/cloudhut/kowl:master

I don't have a running Kafka cluster to test against

We maintain a docker-compose file that launches zookeeper, kafka and kowl: /docs/local.
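
If you just want a rough idea of what that setup looks like, here is a minimal sketch (image tags and port choices are assumptions; the maintained file under /docs/local is the authoritative version):

    version: "3"
    services:
      zookeeper:
        image: confluentinc/cp-zookeeper:6.2.0
        environment:
          ZOOKEEPER_CLIENT_PORT: 2181
      kafka:
        image: confluentinc/cp-kafka:6.2.0
        depends_on:
          - zookeeper
        environment:
          KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
          KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
          KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      kowl:
        image: quay.io/cloudhut/kowl:master
        depends_on:
          - kafka
        ports:
          - "8080:8080"
        environment:
          KAFKA_BROKERS: kafka:9092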

Chat with us

We use Discord to communicate. If you are looking for more interactive discussions or support, you are invited to join our Discord server: https://discord.gg/KQj7P6v

Sponsors

License

Kowl is distributed under the Apache 2.0 License.

Comments
  • disable delete buttons

    Hi,

    Is there any way I can disable the "delete" buttons in the Kowl UI in the places shown below (in the topics list and in the messages list within a topic)? We were planning to use Kowl as a replacement for Kafka Topics UI only if we can disable those delete buttons.

    (screenshots)

  • Not able to see latest messages on kowl?

    I am using Kowl to read messages and am currently unable to read the latest ones. I've tried increasing the RAM, but it does not seem to help.

    I keep running into the backend error message below; what is going on?

    (screenshot)

  • NGINX support

    Kowl does not support NGINX.

    I added an NGINX in front of Kowl and the UI is broken, because many static resources are referenced with absolute rather than relative paths, so they ignore the location prefix configured inside nginx.conf.

    I configured Kowl to be served under the admin/kafka path, but all the static resources point to /. I saw a %PUBLIC_URL% configuration in the frontend, but it needs to be exported as an environment variable from outside of the container (like in a docker-compose yml).

    Example for my nginx configuration:

          location /admin/kafka/ {
               proxy_pass             http://kowl:8077/;
               proxy_redirect  / /admin/kowl/;
               
               proxy_buffering                    off;
               proxy_set_header Host              $http_host;
               proxy_set_header X-Real-IP         $remote_addr;
               proxy_set_header X-Forwarded-For   $proxy_add_x_forwarded_for;
               proxy_set_header X-Forwarded-Proto $scheme;
           }
    

    (screenshots)

  • Feature Request: Make product improvements to `kafka connect`

    (screenshot of the Kafka Connect page)

    • The number of list elements displayed exceeds the paging setting when the page first loads
    • The table overflows
    • The filter option box does not persist its state
    • No support for sorting and filtering by connector status
    • No support for jumping from a connector to its tasks
  • SCRAM-SHA-512 not working with MSK SASL_SSL

    I have an MSK cluster with username/password auth enabled. It works fine when I have, for example, kafka-connect with settings like this:

      kafkastore.security.protocol: "SASL_SSL",
      kafkastore.sasl.mechanism: "SCRAM-SHA-512",
      kafkastore.sasl.jaas.config: org.apache.kafka.common.security.scram.ScramLoginModule required username='alice' password='alice-secret';
    

    But when I configure Kowl with

         sasl:
           enabled: true
           username: alice
           password: alice-secret
           mechanism: SCRAM-SHA-512 
    

    It gives me this error:

    msg":"unable to initialize sasl","source":"kafka_client","broker":"seed 0","err":"Authentication failed during authentication due to invalid credentials with SASL mechanism SCRAM-SHA-512: SASL_AUTHENTICATION_FAILED: SASL Authentication failed."}

  • Add protobuf support for deserializing messages

    Users have reported via email that they are using Protobuf. It's been reported that the proto files are managed using a Git repository that contains all protos. It's also possible to use Confluent's Schema Registry to manage the proto files. Both should be supported. Currently it's probably more common for protos to be managed in a Git repository, since Protobuf support in the Schema Registry is a rather new feature.

    Challenges

    As of now we automatically detect the required deserializer. With Protobuf there's no such thing as a magic byte indicating that a given message is serialized with Protobuf. Therefore we must offer users an option to choose the deserialization method, and furthermore which proto type the backend should use to deserialize a given message (key/value).
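
    One conceivable shape for this is a per-topic mapping in the configuration. A rough sketch (the key names here are assumptions for illustration, not a final design):

         kafka:
           protobuf:
             enabled: true
             mappings:
               - topicName: orders
                 valueProtoType: shop.v1.Order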

  • Configurable Max Message Count

    I was wondering if it would be possible to make the max message count when consuming messages configurable? Right now it's hardcoded at 500, but we'd like to play around with higher values in some of our use cases. I was thinking of having an upper bound set as part of the env vars for the backend, but then letting people adjust the number on a case-by-case basis to anything at or below that limit.

  • No partition watermark for the group's topic available error when loading consumer group

    I have a problem when loading a consumer group and the topic -> consumers menu in the frontend. I used the Kowl Helm chart with the quay.io/cloudhut/kowl:master image.

    I got this error in the frontend (screenshot), and this pod log:

    {"level":"error","ts":"2021-01-06T07:11:40.353Z","msg":"no partition watermark for the group's topic available","group":"XXX","topic":"XXX"} {"level":"error","ts":"2021-01-06T07:11:40.353Z","msg":"Sending REST error","topic_name":"XXX","route":"/api/topics/XXX/consumers","method":"GET","status_code":500,"remote_address":"X.X.X.X","public_error":"Could not list topic consumers for requested topic","error":"failed to get consumer group lags: no partition watermark for the group's topic available"}

    How can I fix this error?

  • Cannot read properties of null (reading 'length')

    Type: TypeError

    Message: Cannot read properties of null (reading 'length')

    Stack (Decoded):

    render (components/pages/consumers/Group.Details.tsx:66:98)
    sum (utils/arrayExtensions.ts:102:43)
    sum (utils/arrayExtensions.ts:102:16)
    render (components/pages/consumers/Group.Details.tsx:66:74)
    allowStateChanges (../../src/core/action.ts:149:15)
    reactiveRender (../../src/observerClass.ts:134:28)
    Array (../../src/core/derivation.ts:177:21)
    isSpyEnabled (../../src/core/reaction.ts:137:23)
    reactiveRender (../../src/observerClass.ts:132:8)

    Stack (Raw): TypeError: Cannot read properties of null (reading 'length')

    at https://kowl03.in.obsec.io/static/js/main.366aafe9.chunk.js:1:252151
    at https://kowl03.in.obsec.io/static/js/main.366aafe9.chunk.js:1:4148
    at Array.reduce ()
    at Array.sum (https://kowl03.in.obsec.io/static/js/main.366aafe9.chunk.js:1:4117)
    at n.value (https://kowl03.in.obsec.io/static/js/main.366aafe9.chunk.js:1:252112)
    at Le (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:13359)
    at https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:65736
    at Xe (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:19292)
    at e.t.track (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:23628)
    at n.l [as render] (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:65701)

    Components:

    at n (https://kowl03.in.obsec.io/static/js/main.366aafe9.chunk.js:1:251171)
    at t (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:116535)
    at t (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:117486)
    at s (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:861432)
    at h (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:862141)
    at Xt
    at n (https://kowl03.in.obsec.io/static/js/main.366aafe9.chunk.js:1:302928)
    at main
    at h (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:528946)
    at a (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:528776)
    at section
    at https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:529206
    at a (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:528776)
    at l (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:62025)
    at section
    at https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:529206
    at a (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:528776)
    at t (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:116535)
    at t (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:117486)
    at n (https://kowl03.in.obsec.io/static/js/main.366aafe9.chunk.js:1:314272)
    at n (https://kowl03.in.obsec.io/static/js/main.366aafe9.chunk.js:1:346303)
    at t (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:113829)
    at t (https://kowl03.in.obsec.io/static/js/2.c77d56f0.chunk.js:2:563547)
    

    Environment:

    NODE_ENV : production
    GIT_SHA  : 1177dfdcda669904b0fbfc54934dbeba322bf2af
    GIT_REF  : v1.4.0
    TIMESTAMP: 1622141499
    appName  : Kowl

    Location:

    Protocol: https:
    Path    : /groups/elastic_persistence-vpqdpaoeqxxzdvjd-activity20200513
    Search  :
    Hash    :

  • Support Protobuf Imports

    Fixes https://github.com/cloudhut/kowl/issues/237

    Fixed by removing the "/" at the beginning of all file paths and adding "/" as an import path to the proto parser.

  • Schema registry view doesn't load

    I am running version 5.0.4 of the Schema Registry, and I noticed that when I point Kowl at this registry and go to the "Schema Registry" view, the REST API times out after 25s. My hunch as to what is going on, which I can't back up with logs because I'm not sure how to see the backend logs, is that the /mode endpoint wasn't added to the Schema Registry until 5.2.x, so this call fails. Would it make sense to show an N/A value for mode if the call to /mode returns a 404?

  • Add ability to download/upload connector configs

    RedPanda is a very nice and feature-rich UI and has really helped our team improve our Kafka operations.

    One of the few features that my team still greatly admires from Confluent's Control Center (C3) is the ability to download connector configs (as JSON or properties files) and then similarly upload connector configurations.

    This is typically used for various purposes, such as backup and restore when a connector needs to be rebooted or during deployment to multiple environments with minor tweaks between the connector definitions. Consider how tedious and error-prone it would be to delete and relaunch a connector through the UI one configuration at a time.

    It would be nice if this feature made its way into Redpanda Console as well.

    Thank you for providing this tool to the community and for all the hard work you've put into making this tool great.

  • Support regex in topic protobuf mappings

    Fixing #405

    This is still a work in progress (it needs more manual and unit tests), but if you find time you can validate that you are OK with the direction I am heading.

    I have concerns about performance if every message ends up matching only the last regex in a long list. I may add a benchmark to get an idea of the difference between regex matching and direct map lookup (I expect a huge difference).

  • When I have RedPanda UI open in two different tabs, the filters are acting weird

    @mehrdad-goudarzi commented on Tue Nov 22 2022

    Version & Environment

    Redpanda version: RELEASE-2.0 843FFF4

    What went wrong?

    What should have happened instead?

    How to reproduce the issue?

    1. Open RedPanda UI
    2. Go to a specific topic
    3. Try to filter the topic using filters
    4. Open the same URL at the same time in a different tab
    5. Click on the cross icon of the created filter item
  • Add GH actions workflow for lint, format and unit tests

    This PR adds two workflows:

    1. Lint Go Code
    2. Unit test Go Code

    I also added a task file for formatting the Go code; I ran the format task once and submitted the result in a separate commit.

  • [Snyk] Upgrade mobx-react from 7.5.3 to 7.6.0

    Snyk has created this PR to upgrade mobx-react from 7.5.3 to 7.6.0.

    :information_source: Keep your dependencies up-to-date. This makes it easier to fix existing vulnerabilities and to more quickly identify and fix newly disclosed vulnerabilities when they affect your project.


    • The recommended version is 1 version ahead of your current version.
    • The recommended version was released 21 days ago, on 2022-11-14.
    Release notes: see the mobx-react GitHub release notes

    Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open upgrade PRs.

    For more information:

    🧐 View latest project report

    🛠 Adjust upgrade PR settings

    🔕 Ignore this dependency or unsubscribe from future upgrade PRs

  • [Snyk] Upgrade mobx from 6.6.2 to 6.7.0

    Snyk has created this PR to upgrade mobx from 6.6.2 to 6.7.0.

    :information_source: Keep your dependencies up-to-date. This makes it easier to fix existing vulnerabilities and to more quickly identify and fix newly disclosed vulnerabilities when they affect your project.


    • The recommended version is 1 version ahead of your current version.
    • The recommended version was released 21 days ago, on 2022-11-14.
    Release notes: see the mobx GitHub release notes

    Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open upgrade PRs.

    For more information:

    🧐 View latest project report

    🛠 Adjust upgrade PR settings

    🔕 Ignore this dependency or unsubscribe from future upgrade PRs

Sarama is a Go library for Apache Kafka 0.8, and up.

sarama Sarama is an MIT-licensed Go client library for Apache Kafka version 0.8 (and later). Getting started API documentation and examples are availa

Jan 1, 2023
A quick introduction to how Apache Kafka works and differs from other messaging systems using an example application.

Apache Kafka in 6 minutes A quick introduction to how Apache Kafka works and differs from other messaging systems using an example application. In thi

Oct 27, 2021
Declare AMQP entities like queues, producers, and consumers in a declarative way. Can be used to work with RabbitMQ.

About This package provides an ability to encapsulate creation and configuration of RabbitMQ([AMQP])(https://www.amqp.org) entities like queues, excha

Dec 28, 2022
Confluent's Apache Kafka Golang client

Confluent's Golang Client for Apache KafkaTM confluent-kafka-go is Confluent's Golang client for Apache Kafka and the Confluent Platform. Features: Hi

Dec 30, 2022
Modern CLI for Apache Kafka, written in Go.

Kaf Kafka CLI inspired by kubectl & docker Install Install from source: go get -u github.com/birdayz/kaf/cmd/kaf Install binary: curl https://raw.git

Dec 31, 2022
Cluster extensions for Sarama, the Go client library for Apache Kafka 0.9

Cluster extensions for Sarama, the Go client library for Apache Kafka 0.9 (and later).

Dec 28, 2022
CLI Tool to Stress Apache Kafka Clusters

Kafka Stress - Stress Test Tool for Kafka Clusters, Producers and Consumers Tunning Installation Docker docker pull fidelissauro/kafka-stress:latest d

Nov 13, 2022
franz-go - A complete Apache Kafka client written in Go

franz-go contains a feature complete, pure Go library for interacting with Kafka from 0.8.0 through 2.8.0+. Producing, consuming, transacting, administrating, etc.

Dec 29, 2022
Testing Apache Kafka using Go.

Apache Kafka Go Testing Apache Kafka using Go. Instructions Provision the single node Kafka cluster using Docker: docker-compose -p apache-kafka-go up

Dec 17, 2021
provider-kafka is a Crossplane Provider that is used to manage Kafka resources.

provider-kafka provider-kafka is a Crossplane Provider that is used to manage Kafka resources. Usage Create a provider secret containing a json like t

Oct 29, 2022
A CLI tool for interacting with Kafka through the Confluent Kafka Rest Proxy

kafkactl Table of contents kafkactl Table of contents Overview Build Development Overview kafkactl is a CLI tool to interact with Kafka through the Co

Nov 1, 2021
Kafka tool to emit tombstones for messages based on header value matches

Langolier Langolier is a CLI tool to consume a Kafka topic and emit tombstones for messages matched by header name/value pairs. Usage Usage of langoli

Sep 22, 2021
Service responsible for streaming Kafka messages.

kafka-stream Service responsible for streaming Kafka messages. What it does? This service reads all messages from the input topic and sends th

Oct 16, 2021
Kfchc - Kafka Connect (connectors / tasks) HealthCheck For AWS ALB and more

kfchc / Kafka Connect HealthCheck Kafka Connect (connectors / tasks) HealthCheck

Jan 1, 2022
Emits events in Go way, with wildcard, predicates, cancellation possibilities and many other good wins

Emitter The emitter package implements a channel-based pubsub pattern. The design goals are to use Golang concurrency model instead of flat callbacks

Jan 4, 2023
Tool for collect statistics from AMQP (RabbitMQ) broker. Good for cloud native service calculation.

amqp-statisticator Tool for collect statistics around your AMQP broker. For example RabbitMQ expose a lot information trought the management API, but

Dec 13, 2021
Apache Pulsar Go Client Library

Apache Pulsar Go Client Library A Go client library for the Apache Pulsar project. Goal This projects is developing a pure-Go client library for Pulsa

Jan 4, 2023
Implementation of the NELI leader election protocol for Go and Kafka

goNELI Implementation of the NELI leader election protocol for Go and Kafka. goNELI encapsulates the 'fast' variation of the protocol, running in excl

Dec 8, 2022
pubsub controller using kafka and base on sarama. Easy controll flow for actions streamming, event driven.

Psub helper for create system using kafka to streaming and events driven base. Install go get github.com/teng231/psub have 3 env variables for config

Sep 26, 2022