1. Introduction

These are my Spring Cloud Data Flow labs, which I developed by reading and modifying some tutorials found on the Internet (the references are kept available).

When I develop my labs, I usually add some Bash functions to make them run more smoothly from the command line (inside a tmux session) on a macOS or Linux environment.
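
For example, one of these helpers might look like the minimal sketch below (the function name and its argument are illustrative, not part of any lab):

# Hypothetical helper: run a server jar only if the file is actually there.
run_jar() {
  local jar="$1"
  [ -f "$jar" ] || { echo "jar not found: $jar" >&2; return 1; }
  java -jar "$jar"
}

$ run_jar spring-cloud-skipper-server-2.6.1.jar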

Currently, these are the labs I have developed:

Before starting any of these labs, type the following command:

$ mkdir -p ~/tmp/spring-cloud-dataflow-labs && cd $_

2. Lab1 - Manual installation

Stop all other labs before starting this one.

2.1. Downloading Server Jars

$ wget -c https://repo.spring.io/release/org/springframework/cloud/spring-cloud-dataflow-server/2.7.1/spring-cloud-dataflow-server-2.7.1.jar
$ wget -c https://repo.spring.io/release/org/springframework/cloud/spring-cloud-skipper-server/2.6.1/spring-cloud-skipper-server-2.6.1.jar

2.2. Install Messaging Middleware

$ docker run -d --hostname rabbitmq --name rabbitmq -p 15672:15672 -p 5672:5672 rabbitmq:3.7.14-management
$ docker ps | grep rabbitmq
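
Optionally, you can wait until the broker port is actually accepting connections before starting the server jars; a minimal sketch, assuming a netcat build that supports -z:

$ until nc -z localhost 5672; do sleep 1; done; echo "RabbitMQ is up"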

2.3. Starting Server Jars

$ tmux new-session 'java -jar spring-cloud-skipper-server-2.6.1.jar' \; \
split-window 'java -jar spring-cloud-dataflow-server-2.7.1.jar' \; \
split-window \; \
select-layout even-vertical

About the command line above:
  1. The first line will start a new tmux session and run Spring Cloud Skipper.

  2. The second line will start Spring Cloud Data Flow.

  3. The third line will start a new pane with a command prompt.

  4. The fourth line will organize the current tmux layout vertically.
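
With the default tmux prefix, you can detach from this session with Ctrl-b d and leave the servers running, then reattach to it later with:

$ tmux attach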

2.4. Accessing Data Flow Dashboard

$ open http://localhost:9393/dashboard
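
If the dashboard does not load yet, you can first check that the Data Flow server is answering on its REST API; for example, with curl against the /about endpoint:

$ curl -s http://localhost:9393/about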

2.5. Clean Up

Stop all the servers started in the tmux session and also kill the rabbitmq container instance with the following command:

$ docker kill rabbitmq
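
To stop the servers, press Ctrl-c in each pane or kill the tmux session itself; if this is the only tmux session you have running, the following also works:

$ tmux kill-server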

3. Lab2 - Installing by using Docker Compose

Stop all other labs before starting this one.

3.1. Starting quickly (for the impatient)

For the impatient, here is a quick start in a single command:

$ wget -O docker-compose.yml \
https://raw.githubusercontent.com/spring-cloud/spring-cloud-dataflow/v2.7.1/spring-cloud-dataflow-server/docker-compose.yml; \
DATAFLOW_VERSION=2.7.1 SKIPPER_VERSION=2.6.1 \
docker-compose up

When you need to start it again, type:

$ DATAFLOW_VERSION=2.7.1 SKIPPER_VERSION=2.6.1 \
docker-compose up
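
Docker Compose also reads these variables from a .env file in the current directory, so the versions do not have to be retyped on every run; a minimal sketch:

$ printf 'DATAFLOW_VERSION=2.7.1\nSKIPPER_VERSION=2.6.1\n' > .env
$ docker-compose up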

Wait for the containers to start, then open the dashboard in your browser at http://localhost:9393/dashboard.

Note the names of the created containers with the following command:

$ docker ps --format '{{.Names}}'

Expected output:

dataflow-server
dataflow-kafka
skipper
dataflow-kafka-zookeeper
dataflow-mysql
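
If one of these containers is missing or keeps restarting, its logs usually explain why; for example:

$ docker logs -f dataflow-server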

3.2. Docker Compose environment variables configuration

3.3. Stopping

$ export DATAFLOW_VERSION=2.7.1; export SKIPPER_VERSION=2.6.1
$ docker-compose down -v

3.4. Postgres Instead of MySQL

$ wget -O docker-compose-postgres.yml \
https://raw.githubusercontent.com/spring-cloud/spring-cloud-dataflow/v2.7.1/spring-cloud-dataflow-server/docker-compose-postgres.yml
$ docker-compose -f ./docker-compose.yml -f ./docker-compose-postgres.yml up -d
$ docker ps --format '{{.Names}}, {{.Status}}'

Expected output:

dataflow-server, Up 3 minutes
dataflow-kafka, Up 3 minutes
skipper, Up 3 minutes
dataflow-postgres, Up 3 minutes
dataflow-kafka-zookeeper, Up 3 minutes
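
Note that, to tear this variant down, the same two compose files must be passed to docker-compose down; for example:

$ export DATAFLOW_VERSION=2.7.1; export SKIPPER_VERSION=2.6.1
$ docker-compose -f ./docker-compose.yml -f ./docker-compose-postgres.yml down -v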

4. Lab3 - Deploying with kubectl

Stop all other labs before starting this one.

4.1. Downloading

$ git clone https://github.com/spring-cloud/spring-cloud-dataflow && cd `basename $_`
$ git checkout v2.7.1
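
Before deploying anything, it is worth confirming which cluster kubectl is pointing at:

$ kubectl config current-context
$ kubectl get nodes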

4.2. Deploying Kafka

$ kubectl create -f src/kubernetes/kafka/

Expected output:

statefulset.apps/kafka-broker created
service/kafka-broker created
deployment.apps/kafka-zk created
service/kafka-zk created

  1. Use kubectl get all -l app=kafka to verify the deployment.

  2. Use kubectl delete all -l app=kafka to clean up afterwards.
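
Before moving on, you can also wait until the broker and ZooKeeper report a successful rollout; for example:

$ kubectl rollout status statefulset/kafka-broker
$ kubectl rollout status deployment/kafka-zk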

4.3. Deploying Services, Skipper and Data Flow

4.3.1. Deploy MySQL