
Kafka DB Sync

Study project that uses Apache Kafka as a syncing mechanism between a monolith DB and a microservice, with producers and consumers written in Go.

The main purpose of this project is to explore a solution for breaking up a monolith DB (usually a legacy DB) into a microservice, considering that the monolith DB will remain in use until further notice (for instance, while other microservices are developed, integrations are refactored, and so on).

Using the Sakila sample database as the monolith DB, I created a microservice responsible only for the Catalogue context. This microservice exposes a REST API: when a user submits a new entry to the film catalogue, for instance, the entry is stored in the microservice DB and synced to the monolith DB. The same applies in the other direction: when the monolith DB is updated, the change is synced to the microservice DB.
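To make that first flow concrete, below is a minimal sketch of what the POST handler could look like, assuming the segmentio/kafka-go client, the google/uuid package, and a film table with uuid, title and year columns in the microservice DB. The route, the title/year fields and the catalogue topic appear in the test section further down; the struct, handler and column names are illustrative, not taken from the actual code.

    package catalogue

    import (
        "database/sql"
        "encoding/json"
        "net/http"

        "github.com/google/uuid"
        "github.com/segmentio/kafka-go"
    )

    // Film is the catalogue entry exchanged between the API, Kafka and the databases.
    type Film struct {
        UUID  string `json:"uuid"`
        Title string `json:"title"`
        Year  int    `json:"year"`
    }

    // CatalogueHandler holds the microservice DB and a writer for the catalogue topic.
    type CatalogueHandler struct {
        db     *sql.DB       // microservice MySQL database
        writer *kafka.Writer // configured for the "catalogue" topic
    }

    // Create handles POST /api/v1/catalogue: store the film locally, then publish
    // an event so the monolith DB can be synced.
    func (h *CatalogueHandler) Create(w http.ResponseWriter, r *http.Request) {
        var film Film
        if err := json.NewDecoder(r.Body).Decode(&film); err != nil {
            http.Error(w, err.Error(), http.StatusBadRequest)
            return
        }
        film.UUID = uuid.NewString() // shared key used to relate rows in both DBs

        // 1. Store the entry in the microservice DB (column names are assumed).
        if _, err := h.db.ExecContext(r.Context(),
            "INSERT INTO film (uuid, title, year) VALUES (?, ?, ?)",
            film.UUID, film.Title, film.Year); err != nil {
            http.Error(w, err.Error(), http.StatusInternalServerError)
            return
        }

        // 2. Publish the event to the catalogue topic, keyed by UUID.
        payload, _ := json.Marshal(film)
        if err := h.writer.WriteMessages(r.Context(), kafka.Message{
            Key:   []byte(film.UUID),
            Value: payload,
        }); err != nil {
            http.Error(w, err.Error(), http.StatusInternalServerError)
            return
        }

        w.WriteHeader(http.StatusCreated)
        _ = json.NewEncoder(w).Encode(film)
    }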

To accomplish that, as mentioned before, I used Apache Kafka as the event-streaming platform to keep loose coupling between the microservice and the legacy DB. I also used a Kafka source connector to continuously capture any changes in the legacy DB. Both databases are MySQL, and the producers, consumers and REST API are written in Go.
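The legacy-to-microservice direction could look roughly like the sketch below: a consumer reads the connector's change events and upserts them into the microservice DB. The segmentio/kafka-go client, the broker address, the topic name legacy.sakila.film and the JSON shape of the events are all assumptions; the real topic and payload depend on how the source connector is configured.

    package catalogue

    import (
        "context"
        "database/sql"
        "encoding/json"
        "log"

        "github.com/segmentio/kafka-go"
    )

    // filmChange is the (assumed) JSON shape of a change event for a film row.
    type filmChange struct {
        UUID  string `json:"uuid"`
        Title string `json:"title"`
        Year  int    `json:"year"`
    }

    // SyncFromLegacy consumes change events produced by the source connector and
    // upserts them into the microservice DB, matching rows by the shared UUID.
    func SyncFromLegacy(ctx context.Context, db *sql.DB) error {
        reader := kafka.NewReader(kafka.ReaderConfig{
            Brokers: []string{"localhost:9092"}, // assumed broker address
            Topic:   "legacy.sakila.film",       // hypothetical connector topic name
            GroupID: "catalogue-sync",
        })
        defer reader.Close()

        for {
            msg, err := reader.ReadMessage(ctx)
            if err != nil {
                return err // context cancelled or broker error
            }
            var change filmChange
            if err := json.Unmarshal(msg.Value, &change); err != nil {
                log.Printf("skipping malformed event: %v", err)
                continue
            }
            // Insert the row if it is new, otherwise update title and year in place.
            if _, err := db.ExecContext(ctx,
                `INSERT INTO film (uuid, title, year) VALUES (?, ?, ?)
                 ON DUPLICATE KEY UPDATE title = VALUES(title), year = VALUES(year)`,
                change.UUID, change.Title, change.Year); err != nil {
                return err
            }
        }
    }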

Important notes

  • For the sake of simplicity, deletes are not implemented (only inserts and updates), and the code is kept as simple as possible;
  • I created a UUID column in the monolith DB's film table to keep a relation between the rows in both databases (see the sketch after this list);
  • The sync between the databases is near real-time;
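To illustrate the UUID note, here is a short sketch of how a consumer of the catalogue topic could apply an event to the monolith DB, matching rows on that column. The table and column names (film, uuid, title, year) are assumed and simplified relative to the real Sakila schema.

    package catalogue

    import (
        "context"
        "database/sql"
    )

    // ApplyToMonolith writes a catalogue event into the legacy film table, using
    // the uuid column added to the monolith schema to match the corresponding row.
    func ApplyToMonolith(ctx context.Context, monolithDB *sql.DB, uuid, title string, year int) error {
        _, err := monolithDB.ExecContext(ctx,
            `INSERT INTO film (uuid, title, year) VALUES (?, ?, ?)
             ON DUPLICATE KEY UPDATE title = VALUES(title), year = VALUES(year)`,
            uuid, title, year)
        return err
    }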

How to run

  • make run
  • make create_source_connector

How to test

  • Insert a film into the catalogue: curl -i -X POST http://localhost:8080/api/v1/catalogue -H "Content-Type: application/json" -d '{"title": "The Sixth Sense", "year": 1999}'
  • To check whether the event was correctly sent to Kafka, access http://localhost:9000 and inspect the catalogue topic
  • In both databases you should be able to see the newly created film
  • Change the film's Title or Year columns in the legacy DB and check that the changes are synced: curl -i -X GET http://localhost:8080/api/v1/catalogue/711a38b0-038a-49c9-a27c-f6780c2b649d
  • Update the film using the REST API: curl -i -X PUT http://localhost:8080/api/v1/catalogue/711a38b0-038a-49c9-a27c-f6780c2b649d -H "Content-Type: application/json" -d '{"title": "The Sixth Sense", "year": 2021}'
  • Keep playing =)