Using DSL in Jenkins for DevOps Automation

Source: Google Images (Jenkins, Kubernetes)

Jenkins is a continuous integration tool that allows us to automate the build and deployment cycle of the applications we deliver to our clients. In some of my previous articles, I have shown how Jenkins can be integrated with other DevOps tools such as Git, GitHub, and Kubernetes. In this article, we will look at another aspect of Jenkins: the use of a Domain Specific Language (DSL).

In Jenkins, we can use the web UI to create jobs, build pipeline views, send emails, and so on. However, with the help of plugins, it is also possible to write the configuration of all the Jenkins jobs as a single script. Executing this script then sets up every job and environment we need, without creating each one individually. In Jenkins, the script is written in the Groovy language.
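As a minimal illustration of the idea, a freestyle job can be described in a few lines of Groovy. (The job name and the shell command here are placeholders for illustration, not part of this article's job chain.)

```groovy
// Minimal Job DSL example: defines a single freestyle job.
// "Example_Job" and the echo command are hypothetical placeholders.
job("Example_Job") {
    description("A minimal job defined entirely in code")
    steps {
        shell("echo 'Hello from a DSL-defined job'")
    }
}
```

When a seed job runs this script, Jenkins creates (or updates) Example_Job on the dashboard, exactly as if it had been configured through the web UI.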

To create multiple jobs using a script, we first need to create a seed job, which will run the script we write and then create all the required jobs and environments.

In this article, we will be writing a script that creates a chain of jobs. The main objective of the job chain is to set up a testing environment, where the code to be deployed is tested. If the code is functioning, an email must be sent to the developer. Finally, a build pipeline view of the chain of jobs must be created.

Before we begin writing the script, we need the following prerequisites:

  1. Installation of Jenkins (I will be using Jenkins installed on a RHEL8 VM)
  2. Installation of the Jenkins Job DSL Plugin.

Writing the script

The script is written in the order of the jobs that must be created. Jenkins itself offers a clear list of all the different features available when writing the DSL script (Reference 1). The contents of the DSL script are given below.

// DSL script for Job 1
job("DevOps_Task6_Job1") {
    description("Job to pull code from GitHub repository")
    scm {
        github('akshayavb99/DevOps_Task6', 'master')
    }
    triggers {
        scm("* * * * *")
    }
    steps {
        shell('''
sudo mkdir -p /root/devops_task6
sudo cp -rvf * /root/devops_task6
''')
        remoteShell('root@192.168.1.26:22') {
            command('''
if kubectl get pvc | grep html-pvc
then
  echo "HTTPD PVC already created"
else
  kubectl create -f /root/devops_task6/httpd-pvc.yml
  kubectl get pvc
fi
''')
        }
        remoteShell('root@192.168.1.26:22') {
            command('''
if kubectl get deploy | grep html-dp
then
  echo "HTTPD Pods are running"
else
  kubectl create -f /root/devops_task6/httpd-server.yml
  kubectl get svc
fi
''')
        }
    }
}

Job 1 creates a folder to store all the files downloaded from the GitHub repository. It also checks whether the Persistent Volume Claim (PVC) for the Kubernetes testing deployment and the deployment itself have been created. If not, both resources are created using the kubectl command.

// DSL script for Job 2
job("DevOps_Task6_Job2") {
    description("Job to shift code into testing environment")
    triggers {
        upstream('DevOps_Task6_Job1', 'SUCCESS')
    }
    steps {
        remoteShell('root@192.168.1.26:22') {
            command('''
html_pods=$(kubectl get pods -l 'app in (html-dp)' -o jsonpath="{.items[0].metadata.name}")
echo $html_pods
kubectl cp /root/devops_task6/index.html "$html_pods":/usr/local/apache2/htdocs
''')
        }
    }
}

Job 2 is a downstream project of Job 1 and is built only if Job 1 builds successfully. It copies the file containing the application code into the first pod of the testing deployment.
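The jsonpath expression above selects only the first pod of the deployment. If the deployment were scaled up, the copy step could loop over every pod instead. The sketch below (a hypothetical helper, not part of the article's script) prints the kubectl cp command for each pod name it is given; in the real job, the echo would be replaced by the kubectl call itself, and the pod list would come from `kubectl get pods -l app=html-dp -o jsonpath='{.items[*].metadata.name}'`.

```shell
# copy_to_pods SRC POD...: print one kubectl cp command per pod.
# Dry-run sketch: echoes the commands rather than executing them.
copy_to_pods() {
  src=$1
  shift
  for pod in "$@"; do
    echo "kubectl cp $src $pod:/usr/local/apache2/htdocs"
  done
}

# Example with two hypothetical pod names:
copy_to_pods /root/devops_task6/index.html html-dp-abc html-dp-def
```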

// DSL script for Job 3
job("DevOps_Task6_Job3") {
    description("Testing the code")
    triggers {
        upstream('DevOps_Task6_Job2', 'SUCCESS')
    }
    steps {
        remoteShell('root@192.168.1.26:22') {
            command('''
status=$(curl -o /dev/null -sw "%{http_code}" http://192.168.99.103:30909)
if [ $status -eq 200 ]
then
  echo "Page is working well"
  exit 0
else
  echo "Page is not working well"
  exit 1
fi
''')
        }
    }
    publishers {
        extendedEmail {
            recipientList('<Your Email>')
            defaultSubject('Build failed')
            defaultContent('Error in build or HTML Page')
            contentType('text/html')
            triggers {
                failure {
                    attachBuildLog(true)
                    subject('Failed build')
                    content('The build failed')
                    sendTo {
                        developers()
                    }
                }
            }
        }
    }
}

Job 3 is a downstream project of Job 2. It has two main functions:

a) It tests the code. If the code is functioning, the build succeeds; otherwise, the build is marked as a failure.

b) If the build fails, an email is sent to the developers informing them of the result, with the build log attached. (This requires the Email Extension Plugin to be installed.)
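The pass/fail decision in Job 3's shell step can be factored into a small function, which makes the logic testable without a live server. (`status_ok` and `check_page` are hypothetical helper names introduced here for illustration; the curl flags are the same ones used in the job script.)

```shell
# status_ok CODE: mirrors Job 3's decision - only HTTP 200 counts as success.
status_ok() {
  [ "$1" -eq 200 ]
}

# check_page URL: fetch the page, print a verdict, and exit nonzero on failure
# so that Jenkins marks the build as failed.
check_page() {
  status=$(curl -o /dev/null -sw "%{http_code}" "$1")
  if status_ok "$status"; then
    echo "Page is working well"
  else
    echo "Page is not working well"
    return 1
  fi
}
```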

// DSL script for the Build Pipeline view (requires the Build Pipeline Plugin)
buildPipelineView('DevOps_Task6') {
    title('DevOps_Task6')
    displayedBuilds(3)
    selectedJob('DevOps_Task6_Job1')
    showPipelineParameters(true)
    refreshFrequency(3)
}

Finally, we have the script for the Build Pipeline view of the job chain we have created, starting from Job 1.

The script is written as a whole in the Jenkins seed job. It can be provided in two ways:

a) It can be entered directly in the space given for the DSL script in the build section.

b) It can be loaded from the file system.

Before running the seed job, we also need to relax a security measure that, by default, requires approval of the DSL script every time the seed job is built. To do this, go to Configure Global Security under Manage Jenkins and disable the following feature.

Figure 1: Disabling Script Security for Job DSL Scripts

Now, we can set up the Seed Job as shown below.

Figure 2: Seed Job Configuration (Part 1)
Figure 3: Seed Job Configuration (Part 2)

The complete DSL script is also given in the GitHub link at the end of the article for future reference.

Once the seed job is run successfully, we see the following output.

Figure 4: After the successful build of the seed job

You can also check the Jenkins dashboard to see all the newly created jobs. If the jobs build successfully, we get the following pipeline view.

Figure 5: Build Pipeline View of the job chain

References

  1. Jenkins Job DSL Plugin API: https://jenkinsci.github.io/job-dsl-plugin/
  2. GitHub link to all the files used: https://github.com/akshayavb99/DevOps_Task6

This article is written as a part of the DevOps Assembly Lines Training Program conducted by Mr. Vimal Daga from LinuxWorld Informatics Pvt., Ltd.

Check out more of my work in the field of DevOps below!

  1. Working with Jenkins — An Introduction: https://medium.com/@akshayavb99/working-with-jenkins-an-introduction-48ecf3de3c25
  2. Working with Jenkins, Docker, Git, and GitHub — Part II: https://medium.com/@akshayavb99/working-with-jenkins-docker-git-and-github-part-ii-d74b6e47140c
  3. Working with Jenkins, Docker, Git, and GitHub — Part III: https://medium.com/@akshayavb99/working-with-jenkins-docker-github-and-kubernetes-part-iii-72deae79bf2e
  4. Working with Dynamic Jenkins Clusters: https://medium.com/@akshayavb99/working-with-dynamic-jenkins-clusters-part-iv-3d925baca0f9
  5. Working with Prometheus and Grafana — An Introduction: https://medium.com/@akshayavb99/working-with-prometheus-and-grafana-an-introduction-ae197c8e27bf

ECE Undergrad | ML, AI and Data Science Enthusiast | Avid Reader | Keen to explore different domains in Computer Science