
The most basic Continuous Integration process is called a commit pipeline. This classic phase, as its name suggests, starts with a commit (or, in Git, a push) to the main repository and results in a report on whether the build succeeded or failed. Since it runs after each change to the code, the build should take no more than 5 minutes and should consume a reasonable amount of resources.

This tutorial is an excerpt from the book Continuous Delivery with Docker and Jenkins, written by Rafał Leszko. The book shows how to build applications with Dockerfiles and integrate them with Jenkins using Continuous Delivery practices such as Continuous Integration, automated acceptance testing, and configuration management. In this article, you will learn how to create a Continuous Integration commit pipeline using Docker and Jenkins.

The commit phase is always the starting point of the Continuous Delivery process, and it provides the most important feedback cycle in the development process: constant information about whether the code is in a healthy state. A developer checks in the code to the repository, the Continuous Integration server detects the change, and the build starts. The most fundamental commit pipeline contains three stages:

  • Checkout: This stage downloads the source code from the repository
  • Compile: This stage compiles the source code
  • Unit test: This stage runs a suite of unit tests

Let’s create a sample project and see how to implement the commit pipeline.

This is an example of a pipeline for a project that uses Git, Java, Gradle, and Spring Boot. Nevertheless, the same principles apply to any other technology.
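Before implementing each stage, it helps to see the overall shape of the pipeline we are aiming for. The following is a minimal sketch of the declarative pipeline with the three stages listed above; the echo steps are only placeholders that we replace with real steps throughout this article:

pipeline {
     agent any
     stages {
          stage("Checkout") {
               steps {
                    echo "Download the source code from the repository"
               }
          }
          stage("Compile") {
               steps {
                    echo "Compile the source code"
               }
          }
          stage("Unit test") {
               steps {
                    echo "Run the suite of unit tests"
               }
          }
     }
}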

Checkout

Checking out code from the repository is always the first operation in any pipeline. In order to do this, we first need a repository. Then, we will be able to create a pipeline.

Creating a GitHub repository

Creating a repository on the GitHub server takes just a few steps:

  1. Go to the https://github.com/ page.
  2. Create an account if you don’t have one yet.
  3. Click on New repository.
  4. Give it a name, calculator.
  5. Tick Initialize this repository with a README.
  6. Click on Create repository.

Now, you should see the address of the repository, for example, https://github.com/leszko/calculator.git.

Creating a checkout stage

We can create a new pipeline called calculator and, as the Pipeline script, put the following code with a stage called Checkout:

pipeline {
     agent any
     stages {
          stage("Checkout") {
               steps {
                    git url: 'https://github.com/leszko/calculator.git'
               }
          }
     }
}

The pipeline can be executed on any of the agents, and its only step does nothing more than download the code from the repository. We can click on Build Now and see if it was executed successfully.

Note that the Git toolkit needs to be installed on the node where the build is executed.
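The git step also accepts additional parameters. For example, if you need a specific branch or your repository is private, you can pass branch and credentialsId; the credentials ID below, github-credentials, is just a hypothetical example of an entry configured in the Jenkins credentials store:

stage("Checkout") {
     steps {
          git url: 'https://github.com/leszko/calculator.git',
               branch: 'master',
               credentialsId: 'github-credentials'
     }
}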

When we have the checkout, we’re ready for the second stage.

Compile

In order to compile a project, we need to:

  1. Create a project with the source code.
  2. Push it to the repository.
  3. Add the Compile stage to the pipeline.

Creating a Java Spring Boot project

Let’s create a very simple Java project using the Spring Boot framework built by Gradle.

Spring Boot is a Java framework that simplifies building enterprise applications. Gradle is a build automation system that is based on the concepts of Apache Maven.

The simplest way to create a Spring Boot project is to perform the following steps:

  1. Go to the http://start.spring.io/ page.
  2. Select Gradle project instead of Maven project (you can also leave Maven if you prefer it to Gradle).
  3. Fill Group and Artifact (for example, com.leszko and calculator).
  4. Add Web to Dependencies.
  5. Click on Generate Project.
  6. The generated skeleton project should be downloaded (the calculator.zip file).

(Screenshot: the http://start.spring.io/ page with the options described above.)

Pushing code to GitHub

We will use the Git tool to perform the commit and push operations:

In order to run the git command, you need to have the Git toolkit installed (it can be downloaded from https://git-scm.com/downloads).

Let’s first clone the repository to the filesystem:

$ git clone https://github.com/leszko/calculator.git

Extract the project downloaded from http://start.spring.io/ into the directory created by Git.

If you prefer, you can import the project into IntelliJ, Eclipse, or your favorite IDE tool.

As a result, the calculator directory should have the following files:

$ ls -a
. .. build.gradle .git .gitignore gradle gradlew gradlew.bat README.md src
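Before running any Gradle commands, it is worth looking at build.gradle, which describes how the project is built. Its exact content depends on the Spring Boot and Gradle versions selected on http://start.spring.io/, so the version numbers below are only an illustration, but for a Gradle project with the Web dependency it looks roughly like this:

buildscript {
     repositories {
          mavenCentral()
     }
     dependencies {
          classpath("org.springframework.boot:spring-boot-gradle-plugin:1.5.2.RELEASE")
     }
}

apply plugin: 'java'
apply plugin: 'org.springframework.boot'

group = 'com.leszko'
version = '0.0.1-SNAPSHOT'
sourceCompatibility = 1.8

repositories {
     mavenCentral()
}

dependencies {
     // the Web starter brings in Spring MVC and an embedded Tomcat
     compile('org.springframework.boot:spring-boot-starter-web')
     // the test starter brings in JUnit and other testing libraries
     testCompile('org.springframework.boot:spring-boot-starter-test')
}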

In order to perform the Gradle operations locally, you need to have Java JDK installed (in Ubuntu, you can do it by executing sudo apt-get install -y default-jdk).

We can compile the project locally using the following command:

$ ./gradlew compileJava

In the case of Maven, you can run ./mvnw compile. Both Gradle and Maven compile the Java classes located in the src directory.

You can find all possible Gradle instructions (for the Java project) at https://docs.gradle.org/current/userguide/java_plugin.html.

Now, we can commit and push to the GitHub repository:

$ git add .
$ git commit -m "Add Spring Boot skeleton"
$ git push -u origin master

After running the git push command, you will be prompted to enter the GitHub credentials (username and password).

The code is now in the GitHub repository. If you want to check it, you can go to the GitHub page and see the files.

Creating a compile stage

We can add a Compile stage to the pipeline using the following code:

stage("Compile") {
     steps {
          sh "./gradlew compileJava"
     }
}

Note that we used exactly the same command locally and in the Jenkins pipeline, which is a very good sign because the local development process is consistent with the Continuous Integration environment. After running the build, you should see two green boxes. You can also check that the project was compiled correctly in the console log.
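Note that the sh step assumes a Unix-like agent with a shell available. If your Jenkins agent runs on Windows, you would use the bat step together with the Windows wrapper script instead, for example:

stage("Compile") {
     steps {
          bat "gradlew.bat compileJava"
     }
}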

Unit test

It’s time to add the last stage, Unit test, which checks whether our code does what we expect it to do. We have to:

  • Add the source code for the calculator logic
  • Write a unit test for the code
  • Add a stage to execute the unit test

Creating business logic

The first version of the calculator will be able to add two numbers. Let’s add the business logic as a class in the src/main/java/com/leszko/calculator/Calculator.java file:

package com.leszko.calculator;
import org.springframework.stereotype.Service;

@Service
public class Calculator {
     int sum(int a, int b) {
          return a + b;
     }
}

To execute the business logic, we also need to add the web service controller in a separate file src/main/java/com/leszko/calculator/CalculatorController.java:

package com.leszko.calculator;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
class CalculatorController {
     @Autowired
     private Calculator calculator;

     @RequestMapping("/sum")
     String sum(@RequestParam("a") Integer a,
                @RequestParam("b") Integer b) {
          return String.valueOf(calculator.sum(a, b));
     }
}

This class exposes the business logic as a web service. We can run the application and see how it works:

$ ./gradlew bootRun

It should start our web service, and we can check that it works by opening http://localhost:8080/sum?a=1&b=2 in a browser. The service should sum the two numbers (1 and 2) and show 3 in the browser.

Writing a unit test

We already have the working application. How can we ensure that the logic works as expected? We have tried it once, but in order to know it constantly, we need a unit test. In our case, it will be trivial, maybe even unnecessary; however, in real projects, unit tests can save us from bugs and system failures.

Let’s create a unit test in the file src/test/java/com/leszko/calculator/CalculatorTest.java:

package com.leszko.calculator;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class CalculatorTest {
     private Calculator calculator = new Calculator();

     @Test
     public void testSum() {
          assertEquals(5, calculator.sum(2, 3));
     }
}

We can run the test locally using the ./gradlew test command. Then, let’s commit the code and push it to the repository:

$ git add .
$ git commit -m "Add sum logic, controller and unit test"
$ git push

Creating a unit test stage

Now, we can add a Unit test stage to the pipeline:

stage("Unit test") {
     steps {
          sh "./gradlew test"
     }
}

In the case of Maven, we would have to use ./mvnw test.

When we build the pipeline again, we should see three green boxes, which means that we’ve completed the Continuous Integration pipeline.
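For reference, the whole Pipeline script at this point, with all three stages in place, looks as follows:

pipeline {
     agent any
     stages {
          stage("Checkout") {
               steps {
                    git url: 'https://github.com/leszko/calculator.git'
               }
          }
          stage("Compile") {
               steps {
                    sh "./gradlew compileJava"
               }
          }
          stage("Unit test") {
               steps {
                    sh "./gradlew test"
               }
          }
     }
}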

Placing the pipeline definition inside Jenkinsfile

So far, we have created the pipeline code directly in Jenkins. This is, however, not the only option. We can also put the pipeline definition inside a file called Jenkinsfile and commit it to the repository together with the source code. This method is even more consistent, because the way your pipeline looks is strictly related to the project itself.

For example, if you don’t need code compilation because your programming language is interpreted (and not compiled), then you won’t have the Compile stage. The tools you use also differ depending on the environment. We used Gradle/Maven because we built a Java project; however, in the case of a project written in Python, you could use PyBuilder. This leads to the idea that pipelines should be created by the same people who write the code: developers. Also, the pipeline definition should be put together with the code, in the repository.

This approach brings immediate benefits, as follows:

  • In the case of a Jenkins failure, the pipeline definition is not lost (because it’s stored in the code repository, not in Jenkins)
  • The history of the pipeline changes is stored
  • Pipeline changes go through the standard code development process (for example, they are subjected to code reviews)
  • Access to the pipeline changes is restricted exactly in the same way as the access to the source code

Creating Jenkinsfile

We can create the Jenkinsfile and push it to our GitHub repository. Its content is almost the same as the commit pipeline we wrote. The only difference is that the Checkout stage becomes redundant, because Jenkins has to check out the code (together with Jenkinsfile) first and only then read the pipeline structure from Jenkinsfile. This is why Jenkins needs to know the repository address before it reads Jenkinsfile.

Let’s create a file called Jenkinsfile in the root directory of our project:

pipeline {
     agent any
     stages {
          stage("Compile") {
               steps {
                    sh "./gradlew compileJava"
               }
          }
          stage("Unit test") {
               steps {
                    sh "./gradlew test"
               }
          }
     }
}

We can now commit the added files and push to the GitHub repository:

$ git add .
$ git commit -m "Add sum Jenkinsfile"
$ git push

Running pipeline from Jenkinsfile

When Jenkinsfile is in the repository, all we have to do is open the pipeline configuration and, in the Pipeline section:

  • Change Definition from Pipeline script to Pipeline script from SCM
  • Select Git in SCM
  • Put https://github.com/leszko/calculator.git in Repository URL

After saving, the build will always run from the current version of Jenkinsfile in the repository.

We have successfully created the first complete commit pipeline. It can be treated as a minimum viable product, and actually, in many cases, it’s sufficient as the Continuous Integration process. In the next sections, we will see what improvements can be done to make the commit pipeline even better.

To summarize, we covered some aspects of the Continuous Integration pipeline, which is always the first step of Continuous Delivery. If you’ve enjoyed reading this post, do check out the book Continuous Delivery with Docker and Jenkins to learn more about deploying applications using Docker images and testing them with Jenkins.
