Introducing a GCP Pub/Sub Message Framework


Everything is in either GCP or AWS these days, and there is so much data coming from so many sources!

To try to make sense of all the logs arriving from different sources on GCP Pub/Sub, I created this little serverless framework that uses Kafka streams on Kubernetes for alert correlation.

Installing Kubeless

Follow these instructions. Customize the Kubeless config file at kubeless-config.yaml, and then run:

$ make kl

Creating Kubeless topic

In Kafka, messages are published to topics. The functions run by Kubeless (the consumers) will receive these messages once we create the topic:

$ kubeless topic create reactor
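For illustration, a Kubeless consumer is just a regular function deployed against that topic. Here is a minimal sketch of what a Python handler could look like; the function name, the "level" field, and the (event, context) signature are assumptions based on the newer Kubeless Python runtimes, not this repo's actual code:

```python
import json

def handler(event, context):
    """Hypothetical consumer for messages published to the 'reactor' topic.

    Kubeless delivers the Kafka message payload in event['data']; here we
    parse it as JSON and flag anything that looks worth alerting on.
    """
    payload = event["data"]
    log = json.loads(payload) if isinstance(payload, str) else payload
    if log.get("level") in ("ERROR", "CRITICAL"):
        # In a real setup, this is where correlation/forwarding would happen.
        return {"alert": True, "message": log.get("message", "")}
    return {"alert": False}
```

It could then be deployed and wired to the topic with something like (the runtime version and file names here are placeholders):

$ kubeless function deploy reactor --runtime python2.7 --handler reactor.handler --from-file reactor.py --trigger-topic reactor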

Firing Up Containers

To spin up Logstash, Elasticsearch, ZooKeeper, and Kafka (the producer side), so that logs flow into the Kafka topic consumed by Kubeless, run:

$ make pipeline
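The key piece of that pipeline is Logstash's Kafka output, which pushes events onto the topic the functions consume. A sketch of what that output stanza could look like (the broker address is an assumption; adjust it to wherever your Kafka service lives):

```
output {
  kafka {
    bootstrap_servers => "kafka.kubeless:9092"
    topic_id => "reactor"
    codec => json
  }
}
```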


To debug any of the pods (Kubeless, Kafka, or ZooKeeper), grab the pod name with:

$ make pods

and then run:

$ kubectl logs <podname> --namespace=kubeless


Enjoy and let me know what you think! :)

PS: If you want to learn more about GCP, check my resources and labs here.