Containerization with Docker for your automation platform (Docker + Selenium)
Hello friends, I am writing this blog after a long time. In the meanwhile, I was busy with some R&D work, and one of the research topics was: “How can we get our automation platform running in a containerized environment?”. The first question that would hit your mind would probably be “What does he mean by that?”, and the second would of course be “Why?”. These would have been my questions as well in the beginning, and we will answer them as we go along in this blog. So hold your horses, we are going to step into the world of containerization with Docker.
First of all, let me brief my readers on what containerization means.
What is Containerization?
Containerization is a technique that is broadly used nowadays to ease the process of deployment and to give everyone directly involved with the application an identical platform. It gives developers a hassle-free way of developing and deploying applications on a common platform with the same specifications.
For example, say Developer A is developing an application against the Java 8 specification and uses version 4.5.5 of the Hibernate framework. Wouldn’t it be better if Developer B and Developer C in the team also had exactly the same specifications? That would reduce the dependency of the application under development on the host machine. Likewise, when I deploy my application on the cloud, I know it will run with the same configuration that I used while developing on my host machine.
So, technically, if I may put it that way, app containerization is an operating-system-level virtualization technique that helps deploy and run distributed applications without launching a full virtual machine for each one. Multiple isolated systems run on a single control host and access a single kernel. An application container holds the components such as files, environment variables and libraries necessary to run the desired software. Because resources are shared in this way, application containers place less strain on the overall resources available. Besides, portability of the application also becomes much easier.
For example, if a variation from the standard image is required, a container can be created from that image which adds the new library on top of it. I hope that clears up the idea of what containerization is.
Let’s jump in to understand what Docker is.
What is Docker?
Docker is a container management software. It helps in developing, running and shipping your application. Docker enables us to separate our application from the infrastructure so that we can deliver software quickly. If you check the global IT trends of today, Docker has become immensely important for handling application deployment, and the main reason behind it is the simplicity of its implementation. The architecture of Docker is very simple and is based on a plain client-server interaction. The brain behind Docker is Solomon Hykes, who was also a co-founder and CEO of dotCloud, the closest competitor of a very well known cloud service provider that we all know of, “Heroku” ;).
Now let’s dive a bit deeper into understanding the architecture and functionality of Docker. Docker uses the Docker Engine to handle the process of interacting with containers and images.
Docker platform:
Docker provides the ability to package and run an application in a loosely isolated environment called a “container”. The isolation allows a user to run many containers simultaneously on a given host. Because of the lightweight nature of containers, which run without the extra load of a hypervisor, you can run more containers on a given hardware combination than if you were using virtual machines.
Docker Engine:
Docker Engine is a client-server application with these major components:
- A server which is a type of long-running program called a daemon process.
- A REST API which specifies interfaces that programs can use to talk to the daemon and instruct it what to do.
- A command line interface (CLI) client.
The CLI uses the Docker REST API to control or interact with the Docker daemon through scripting or direct CLI commands. Many other Docker applications use the underlying API and CLI.
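To make the client-server split concrete, here is a minimal sketch of talking to the daemon’s REST API directly, without the CLI. It assumes a Linux host where the daemon listens on the default UNIX socket /var/run/docker.sock, and curl 7.40 or newer; the exact response payload depends on your Docker version:

# Ping the daemon over its UNIX socket; should print "OK"
curl --unix-socket /var/run/docker.sock http://localhost/_ping

# Ask the daemon for version information (the CLI's "docker version"
# issues essentially the same request under the hood)
curl --unix-socket /var/run/docker.sock http://localhost/version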
The daemon creates and manages Docker objects, such as images, containers, networks, and data volumes. Here is how Docker is structured.
Docker Architecture:
Docker uses a client-server architecture. The Docker client talks to the Docker daemon, which does the heavy lifting of building, running, and distributing the Docker containers. The Docker client and the daemon can run on the same system, or a Docker client can connect to a remote Docker daemon. The Docker client and daemon communicate using the REST API.
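As a small illustration of that last point: the client can be pointed at a different daemon either with the -H flag or the DOCKER_HOST environment variable (the ssh:// transport needs a reasonably recent Docker release). The host name remote-docker-host below is just a placeholder:

# Talk to the local daemon (the default)
docker ps

# Point the client at a remote daemon over SSH (placeholder host name)
DOCKER_HOST=ssh://user@remote-docker-host docker ps

# Or pass the daemon address explicitly for a single command
docker -H ssh://user@remote-docker-host ps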
Docker daemon:
The Docker daemon runs on a host machine. The user uses the Docker client to interact with the daemon.
Docker client:
It is the primary interface to Docker calls and services. It accepts commands from the user and communicates with the Docker daemon.
Docker containers:
A Docker container is a runnable instance of a Docker image. You can run, start, stop, move, or delete a container using Docker API or CLI commands. When you run a container, you can provide configuration metadata such as networking information or environment variables.
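As a quick sketch of that lifecycle, using the stock nginx image from Docker Hub (any image would do); the container name, environment variable and port mapping are chosen purely for illustration:

# Create and start a container, passing configuration metadata:
# a name, an environment variable and a port mapping
docker run -d --name web-demo -e APP_ENV=staging -p 8080:80 nginx

# Stop, restart and finally remove the same container
docker stop web-demo
docker start web-demo
docker rm -f web-demo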
Docker registries:
A Docker registry is a library of images. A registry can be public or private, and can be on the same server as the Docker daemon or Docker client, or on a totally separate server.
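For instance, pulling from the default public registry (Docker Hub) and from a private registry look like this; the private registry address below is only a placeholder:

# Pull from Docker Hub, the default public registry
docker pull ubuntu:latest

# Pull from a private registry (placeholder host and port)
docker pull myregistry.example.com:5000/my-team/ubuntu:latest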
Now, with our concepts of Docker clear, we can move on to understanding how we can implement the automation platform on Docker.
Implementation:
I have a basic automation framework which executes a log-in and log-out flow. I will share the project link later on. The prerequisites for executing the automation tests are:
1. The project should be a Maven project with a properly implemented pom.xml containing all the project dependencies.
2. The project should have a testng.xml file listing all the test classes that need to be executed.
3. The browser drivers need to be properly configured.
Now, a very important point to remember here: the images are not provided with a GUI, so we only have the console from which to execute our test automation. We can either install xvfb in the Linux image or use the PhantomJS driver, which runs your tests headless. Execute your tests from the console using Maven first; a sketch of the command follows.
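A minimal sketch of that console run, assuming the suite file is named testng.xml and is wired into the Surefire plugin in pom.xml (the property name is a standard Surefire user property, not something specific to this project):

# Run the whole TestNG suite headlessly from the console
mvn clean test

# Or point Maven at a specific suite file explicitly
mvn clean test -Dsurefire.suiteXmlFiles=testng.xml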
Download Docker for your environment by following the documentation at https://docs.docker.com/. Once Docker has been installed, follow the steps below to see how we can start off with Docker.
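To verify the installation, something like the following should work on any platform (hello-world is Docker’s own smoke-test image):

# Confirm the client and daemon are installed and talking to each other
docker --version
docker info

# Run Docker's smoke-test image end to end
docker run hello-world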
Step-1: To begin with, we want a base image into which we can ship our changes and containerize it. So, just type in docker run ubuntu. Docker will first search for the image locally; if it does not find one, it will download the image from the “Docker registry”. A sketch of the command follows.
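A minimal sketch of that first pull-and-run, with the interactive flags added so you actually land in a shell inside the container:

# Pulls ubuntu:latest from Docker Hub if it is not present locally,
# then starts a container and drops you into a bash shell inside it
docker run -it ubuntu /bin/bash

# Leave the container shell again
exit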
Once this is done, you can check the images and containers present locally on your host machine.
Step-2: Run “docker ps -a” to get the list of all containers existing on your machine; running “docker images” will give you the list of all images on the system, as sketched below.
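Both listing commands, for reference:

# List every container, running or stopped
docker ps -a

# List only the ones still running
docker ps

# List all images stored locally
docker images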
Step-3: Now what we need to do is write a template which is basically called a “Dockerfile”. The Docker client sends this file, along with the build context, to the daemon when you build an image. Create a file in your project root folder named “Dockerfile”.
In this Dockerfile we are going to write down a few lines of instructions to build an image on top of the existing base image, which will install the required dependencies and update the system. Here is the code snippet that you would need to run this entire automation suite. In my project I keep the Dockerfile inside the Docker directory.
# Use the latest image from UBUNTU installed in the machine
FROM ubuntu:latest
MAINTAINER corefinder@docker.com

# Update ubuntu system
RUN apt-get update

# Install java version on ubuntu-selenium image
RUN apt-get install -y default-jdk
RUN apt-get install -y default-jre

# Install phantomjs
RUN apt-get install -y phantomjs

# Install maven on ubuntu-selenium image
RUN apt-get install -y maven

# Install git on ubuntu-selenium image
RUN apt-get install -y git

# Get the repository onto the local system
# (the OAuth token is redacted; replace <oauth-token> with your own;
# repo name inferred from the WORKDIR below)
RUN git clone https://:<oauth-token>@github.com/SoumyajitBasu1988/DockerSelenium.git

# Run the maven command to execute all the tests
WORKDIR "/DockerSelenium"
RUN mvn clean install test

ENTRYPOINT ["/bin/bash"]
So, this Dockerfile basically gets the dependencies installed within the container, which include:
1. git
2. maven
3. Java 8
After this it takes a clone of the required repository, navigates to the working directory and runs mvn clean install test, which executes your test automation within the container. I have used an OAuth token to clone the repository; you could set up an SSH key inside the container as well.
Next, build the image from the Dockerfile using docker build -t <tag-name> ., where -t gives the image a tag name and the trailing . tells Docker that the Dockerfile is in the current working directory. Once the build has executed, you can even access the container: just type in docker run -it <tag-name>. Running that command logs you into the container, where you can directly run the tests, as sketched below.
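A minimal end-to-end sketch, assuming the Dockerfile lives in the current directory and using docker-selenium as a purely illustrative tag name:

# Build the image from the Dockerfile in the current directory
docker build -t docker-selenium .

# Start an interactive container from that image; the ENTRYPOINT
# drops you into a bash shell inside /DockerSelenium
docker run -it docker-selenium

# Inside the container, the suite can be re-run at any time
mvn clean test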
So, I will come to the conclusion by answering your question, “Why would we need Docker?”. I hope I was able to convey the immense benefit of containerizing your automation platform. To sum it up in terms of deploying your automation platform:
1. It becomes far easier if you have to showcase your framework to the client side: all you need to do is push your image to Docker Hub, and they can pull it from there. Pushing your latest image to Docker Hub is pretty simple and very similar to using git; see the sketch after this list.
2. As I said earlier, portability becomes much easier. If you are running your code on a server, you don’t have to worry about setting up the entire automation platform; you can always build the container from the Dockerfile.
3. It reduces the dependency of the automation platform on the host machine.
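For the first point, a sketch of the push workflow; the account name your-dockerhub-user and the image tag are placeholders, not the author’s actual repository:

# Log in to Docker Hub
docker login

# Tag the local image against your Docker Hub account (placeholder name)
docker tag docker-selenium your-dockerhub-user/docker-selenium:latest

# Push it so anyone with access can pull it
docker push your-dockerhub-user/docker-selenium:latest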
That’s it, folks, for now. I hope that in this Docker tutorial I was able to deliver something that can be utilized when automating your application, and that it gives you a new direction for deploying your automation codebase.
Soumyajit is a software professional with 5+ years of experience, with his prime focus on automation technologies built around quality development, and he takes an interest in CI/CD processes. He helps develop the QA process in an organization with his skills in automation for the web platform. His focus is on improving the delivery process for an ongoing project, and he connects the dots to help out with a successful deployment. He has experience working in the analytics, e-commerce, and ad-tech domains.
Besides being a professional, he takes an immense interest in learning new skills and technologies. He is a research guide author/writer at DZone and Web Code Geeks. He also maintains a blog platform of his own where he likes to keep up with his technology explorations.