Setting up CI/CD for GKE with Jenkins

  • Malavika
  • April 13, 2020

Jenkins is an open source tool for continuous integration and deployment that makes building and testing your applications easier and faster through automation.

This article demonstrates how to set up a build job on your existing Jenkins machine that deploys to a Google Kubernetes Engine (GKE) cluster, without having to set up another machine on GKE.


Google Kubernetes Engine is a cluster orchestration and management service from Google that helps you deploy and scale your Docker containers and container clusters running on Google Cloud. It is based on Google’s open source container management system, Kubernetes, and it lets you interact with your cluster, which is essentially a group of Compute Engine instances. A cluster can have multiple node pools, and each node pool contains multiple nodes. A node can serve multiple workloads, and each workload can contain multiple pods. A pod contains the container that runs your build image. Here’s a handy comic that explains some of the core concepts of GKE and the features it offers.


Prerequisites

  • A GKE cluster running in any region
  • A Jenkins server
  • A Docker image in Google Container Registry (GCR)

Prepare Jenkins Machine

  • Log in to GCP, create a service account with the following roles, and download its JSON credentials:
    – Cloud Build Admin or Cloud Build Service Account
    – Kubernetes Engine Admin/Service Account
    – Storage Admin
    – Cloud Run Admin
  • SSH into your Jenkins server and log in as the jenkins user. If the user is not present, you can add it with:

sudo adduser jenkins          # create the jenkins user if it does not exist
sudo adduser jenkins sudo     # add the user to the sudo group
sudo passwd jenkins           # set a password if none is set for the jenkins user
sudo -u jenkins /bin/bash     # switch to the jenkins user


If you have trouble running docker commands as the jenkins user without sudo,
refer to Step 2 of the link above on how to execute the docker command without sudo.
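The usual fix is to add the jenkins user to the docker group. This is a sketch, assuming Docker is already installed and created the standard docker group:

```shell
# Sketch: let the jenkins user run docker without sudo.
# Assumes Docker is installed and the standard "docker" group exists.
sudo usermod -aG docker jenkins

# The group change takes effect on the next login; then verify with:
sudo -u jenkins docker ps
```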

GCloud Configuration

  • Set project

gcloud config set project <project-name>

  • Activate the service account created above:

gcloud auth activate-service-account <account-name> --key-file=<key-file>

The key file is the JSON file downloaded when the service account was created.

  • To connect to the cluster, ~/.kube should contain a config file for the cluster. To generate it, go to the Kubernetes cluster page, click Connect, copy the command, and execute it as the jenkins user:

gcloud container clusters get-credentials <cluster-name> --zone <zone-name> --project <project-name>

Verify that the credentials work by running any kubectl command, e.g., kubectl get deployment

  • To register gcloud as the docker credential helper:

gcloud auth configure-docker
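To confirm the helper was registered, you can inspect Docker’s client config. This is a sketch, assuming gcloud wrote to the default config location (~/.docker/config.json), where the entries appear under "credHelpers":

```shell
# Sketch: check that gcr.io is wired to the gcloud credential helper.
grep '"gcr.io"' ~/.docker/config.json && echo "gcr.io helper registered"
```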

Kubernetes Setup

This step can be skipped if you are using an existing cluster with a deployment running.

Cluster Creation

  • In GCP, search for Kubernetes Engine in the left side panel. In the view that opens, click Clusters in the sidebar, then click Create.
  • In the create view, give your cluster an appropriate name, then select the node pool from the left panel and add its machine configuration. You can enable autoscaling, set the minimum/maximum number of nodes to scale to, and limit the maximum number of pods per node.
  • In the Nodes tab, select a machine configuration based on your usage. These node settings act as a template that is used whenever new nodes are created in the same node pool. The maximum boot disk size at the time of writing is 100 GB, and it cannot be changed after creation.
  • You can also set the maximum pods per node in the Networking tab. By default this is set to 8.
  • Hit Create.
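The same cluster can also be created from the CLI. The sketch below mirrors the console steps above; the machine type and node/pod limits are illustrative placeholders, not values from this article:

```shell
# Hypothetical CLI equivalent of the console steps above.
# Machine type and scaling limits are placeholders; adjust to your usage.
gcloud container clusters create <cluster-name> \
  --zone <zone-name> \
  --machine-type e2-standard-2 \
  --enable-autoscaling --min-nodes 1 --max-nodes 3 \
  --max-pods-per-node 32
```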

Creating Deployment

  • In the Kubernetes Engine side panel, click Workloads, just beneath the Clusters option. Click Deploy.
  • Add your existing GCR image.
  • Configure the deployment/application name (DO NOT use “_” in the deployment name) and click Deploy.

Deployment Actions

Autoscale: Sets the autoscaling limits (minimum and maximum number of pods) and custom triggers for scaling.

Scale: Manually scales the number of pods.

Expose: Exposes the service on a port.

Rolling update: Updates an existing deployment with a new image.

For this setup, we’re going with a rolling update, assuming you already have a deployment in place that needs to be replaced with a new image.

Jenkins Build Job

  • In Jenkins, click New Item on the left side and create a new freestyle project.
  • In the build step, select the Execute Shell option and add:
# cd into the working directory containing the Dockerfile
cd <work-dir>

# Activate credentials for kubectl (first run only; this can also be
# done once directly on the Jenkins machine)
gcloud auth activate-service-account <account-name> --key-file=<key-file>

# Get credentials for the cluster
gcloud container clusters get-credentials <cluster-name> --zone <zone-name> --project <project-name>

# Build the image and push it to GCR
gcloud builds submit --tag gcr.io/<project-name>/nginx:${version}

# To create a new deployment
kubectl create deployment <deployment-name> --image=gcr.io/<project-name>/nginx:${version}

# For a rolling update
kubectl set image deployment/<deployment-name> nginx=gcr.io/<project-name>/nginx:${version} --record


  • You can use the Jenkins ${BUILD_NUMBER} variable for incremental tagging of images.
  • The same build tag cannot be reused to rebuild an image; doing so will result in a Jenkins build failure.
  • Docker needs to be up and running on the Jenkins machine before triggering this job.
  • The deployment step is a rolling update, so a deployment with that name should exist before triggering the job.
  • For a manual rollback, you can parameterize the version in the build job and redeploy a previous version.
  • To check logs for a running instance, go to GKE > Workloads > your workload and select a pod, then click View logs.


This demonstrates a simple CI/CD workflow with Jenkins, Docker, and GKE. Its main benefit is the flexibility to extend it to more images as your development needs grow, without having to set up a machine on the GKE cluster.

Thanks for reading! Don’t forget to follow us on Twitter or reach out to us at

  • Tags:
  • AI
  • DevOps
  • GKE
  • Jenkins
  • Tech