Dockerise Selenium-Python tests and robustly spin up Docker containers for sequential or parallel test execution.
The objective of this post is to explain how to build an image of your Selenium project source code, spin up containers dynamically for a specific browser (or for cross-browser runs), and discard each container soon after the individual test execution finishes.
Benefits of introducing Docker in Selenium tests:
- Parallel test execution.
- Cross-browser testing.
- Selenium tests can run in a TFS CI release pipeline with minimal changes (if required).
Before we start with the design, I have made a few assumptions, listed below.
- Ubuntu (16.04+)
- Docker already installed (18.09)
- Proficiency in Python 3
- Knowledge of the Selenium Grid concept
- A Selenium-Python project using the pytest framework (with Remote WebDriver to perform the test executions)
- Git in use, so that you can check out your code base
- Knowledge of Python Flask, to create REST APIs
Let me briefly describe docker-selenium, and then we can jump into the design aspects of the workflow.
It uses the same architecture as Selenium Grid, but each component (the hub and the nodes) is a separate container.
- Docker is widely used by test teams to run their tests in parallel on Selenium Grid.
- You can adopt this approach on a Linux or Windows machine, or in a TFS CI/CD pipeline.
Advantages of Docker Selenium
- It gets rid of environment dependencies.
- It is lightweight.
- You can pull the Docker images from Docker Hub, or place the .tar files of the images in an artifact repository such as Artifactory and pull them from there. That way you cut down the dependency on external sources.
I recommend using a dedicated Linux (Ubuntu) machine for this setup.
Let’s get started!!!
Design a workflow to perform automated test script execution in parallel
- Our first step is to identify whether a Docker image of the Selenium project (code base) is available in the Docker registry.
- If the project image is found, it should be deleted, as you need the updated source code to test on any given day.
- Pull the Selenium hub and chrome-node-debug Docker images from Docker Hub (if necessary). Since we will be using a docker-compose file to create the containers, it will auto-download the required images if they are not available in your Docker registry. You can refer to this.
- Check out the latest code base from Git into a predefined location.
- Introduce a Flask REST API to spin up containers (Selenium hub, chrome-node-debug, or other browsers) only on request.
- Build a Docker image of your Selenium code base and provide the entry point (command) in the Dockerfile, so that test execution starts right from the image.
- Based on the test script's needs, identify the target browser and make a request to the 'Container_Manager' for an environment in which to execute the test script.
- At the end of each test execution, the containers should be destroyed, since they are no longer needed. → Clean-up mechanism
“Please note: Python scripts suffice to perform all the operations listed in the design diagram.”
Hint: the Python package ‘docker’ can be used to access all the Docker images/containers available in your registry. For more details you can refer to this.
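For instance, here is a sketch of how the ‘docker’ SDK could be used for the first two workflow steps: checking whether a project image already exists and removing it so a fresh one can be built. The image name "selenium-tests" is a placeholder, not from this project.

```python
# Sketch: find and remove a stale project image via the 'docker' SDK.
# Assumes `pip install docker` and a reachable Docker daemon; the image
# name "selenium-tests" is an illustrative placeholder.

def image_exists(tags, name):
    """Return True if any image tag (e.g. 'selenium-tests:latest')
    matches the given repository name. Pure helper, no Docker needed."""
    return any(tag.split(":")[0] == name for tag in tags)

def remove_stale_image(name="selenium-tests"):
    import docker  # imported here so the helper above stays dependency-free
    client = docker.from_env()
    for image in client.images.list():
        if image_exists(image.tags, name):
            # Force-remove so a fresh image can be built from today's code.
            client.images.remove(image.id, force=True)
```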
Let us start with the Container Manager (refer to the diagram).
a. Container Manager — a customized Docker image built on a Flask REST API, which can be used to create chrome-node-debug and Selenium hub containers (on request).
First, we will build an image for the Container Manager.
Create a Flask application with Rest-API’s,
Python packages required: Flask==1.0.2, flask_restful
1. To create containers:
a. GET request: create containers whenever there is a request, and respond (acknowledge) with the port-id so that the test scripts can use the containers for test execution.
i. Create a template docker-compose.yml file.
ii. Create methods for the following steps:
a. Generate a random 4-digit number.
b. Generate two random 12-character alphanumeric values.
c. Create a directory named after the 4-digit number (step a).
d. Copy the docker-compose template into the directory and:
d.1. Replace PORT_ID with the same 4-digit number.
d.2. Replace ‘SELENIUM_HUB_NAME’ with one 12-character alphanumeric value.
d.3. Replace ‘CONTAINER_CHROME’ with the other 12-character alphanumeric value.
d.4. Navigate to the compose file path and execute the command: docker-compose up -d
This will create the containers for both the Selenium hub and chrome-node-debug.
2. To delete containers → clean-up operation:
b. POST request: to delete the containers, pass the port-id as an input in the request URL (the port-id acknowledged during the test execution).
When the POST request is made, use request.args (which carries the port-id), call a method to locate the folder (named after the port-id), read the container names from the compose file, and delete the containers accordingly (search for the containers by name, or execute the Docker commands “docker-compose stop” and “docker-compose down”).
Now that the Flask app is ready, let us create a Dockerfile for it.
Provide the dependency Python packages in a ‘requirements.txt’ file.
(Note: you can provide any image name and tag.)
To build an image, execute the following command:
“docker build -t container_manager:latest .”
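For reference, the Dockerfile behind this image could look something like the sketch below. The file names app.py and requirements.txt are assumptions; adjust them to your project layout.

```dockerfile
# Hypothetical Dockerfile for the Container Manager image.
FROM python:3.6-slim

WORKDIR /app

# Install the dependencies listed in requirements.txt (Flask, flask_restful).
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY app.py .

EXPOSE 5000

# Start the Flask app on container start-up.
CMD ["python3", "app.py"]
```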
“Now comes the important part: you need to create containers from within a container. Confusing??? OK, let me elaborate. Have a look at the design diagram: when a component asks a container (the Flask app) for resources, i.e., new containers (Selenium hub and chrome-node-debug), that container needs access to the Docker daemon, which by default it does not have.”
There are two options:
Option 1: give the container access to the host’s docker.sock file.
Option 2: install docker-ce in the ‘Container_Manager’ image and then execute docker run commands.
I will go with option 1; execute the following command:
“docker run -v /var/run/docker.sock:/var/run/docker.sock -v /usr/bin/docker:/usr/bin/docker -d -p 5000:5000 container_manager”
Now a container is created for ‘Container_Manager’; you can verify this by listing the active containers: docker ps -a
You can now access the REST API endpoints on port 5000, as port 5000 is exposed in the docker run command.
Now you can make a request to the ‘Container Manager’, for example:
URL : localhost:5000/api/v1/create-container
“api/v1/create-container” is the endpoint which I have created
Type of Request : GET
Response: port-id (4446)
Port 4446 can be accessed by the outside world. This means we can see the browsers running on the Selenium hub by hitting the URL “localhost:4446/grid/console”, and you'll see a dashboard like the screenshot given below.
Note: you also need to change your Selenium source code so that it requests the container port details from the Container_Manager using the endpoints created above.
The same applies to deleting the containers.
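On the Selenium side, the change could be sketched roughly as follows. Only the create-container endpoint comes from this article; the delete endpoint path and the helper names are assumptions for illustration.

```python
# Sketch: test code acquires a fresh grid from the Container Manager and
# releases it afterwards. The delete endpoint path, MANAGER URL, and
# helper names are illustrative assumptions.
from contextlib import contextmanager

MANAGER = "http://localhost:5000"

def hub_url(port_id):
    """Build the Remote WebDriver command_executor URL (pure helper)."""
    return "http://localhost:%s/wd/hub" % port_id

@contextmanager
def selenium_grid():
    import requests  # kept inside so the pure helper has no dependencies
    port_id = requests.get(MANAGER + "/api/v1/create-container").json()["port_id"]
    try:
        yield hub_url(port_id)
    finally:
        # Clean-up: ask the Container Manager to destroy the containers.
        requests.post(MANAGER + "/api/v1/delete-container",
                      params={"port-id": port_id})

# Usage inside a test (selenium assumed installed):
# with selenium_grid() as executor:
#     from selenium import webdriver
#     driver = webdriver.Remote(command_executor=executor,
#                               desired_capabilities={"browserName": "chrome"})
```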
Now let’s start building an image for your project.
Pull the latest Selenium source code from the Git repository and build the Docker image from it.
The steps to build a Docker image for the Selenium source code are illustrated below.
Use the commands listed above to create a ‘Dockerfile’, and update the <Project_Dir>/<Package_Dir> Python project and package directory details.
The Dockerfile does the following:
1. Update the system.
2. Install Python 3.
3. Create the project directory.
4. Copy the project source into the newly created directory.
5. Set the new directory path as PYTHONPATH, so that the project path is discoverable from the command line.
6. Install the Python dependency packages to support the test script execution.
7. Entry point — to start the test executions.
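A Dockerfile following the seven steps above might look like this sketch. <Project_Dir>, <Test_root_folder>, the requirements file name, and the pytest arguments are placeholders you should adapt to your project.

```dockerfile
# Hypothetical Dockerfile for the Selenium test code base.
FROM ubuntu:16.04

# 1. Update the system and 2. install Python 3.
RUN apt-get update && \
    apt-get install -y python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

# 3. Create the project directory and 4. copy the code base into it.
RUN mkdir -p /opt/<Project_Dir>
COPY . /opt/<Project_Dir>

# 5. Make the project path discoverable from the command line.
ENV PYTHONPATH=/opt/<Project_Dir>

# 6. Install the Python dependency packages.
RUN pip3 install -r /opt/<Project_Dir>/requirements.txt

# 7. Entry point: start the test execution when a container runs.
WORKDIR /opt/<Project_Dir>
CMD ["python3", "-m", "pytest", "-v", "-n", "4", "-m", "smoke", "<Test_root_folder>"]
```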
As I mentioned in the beginning, I will be using the pytest framework.
For the entry point, you can use the following command:
CMD ["python3", "-m", "pytest", "-v", "-n", "<no_of_parallel_workers>", "-m", "<Test_category_to_execute>", "<Test_root_folder>"]
- -n sets the number of distributed/parallel test workers (provided by the pytest-xdist plugin).
- -m selects the test scripts to run by marker.
- Test_category_to_execute — if you have marked the test scripts as ‘Smoke/Sanity/Regression’ etc., you can mention the keyword here.
- Test_root_folder — pytest discovers the test scripts from this folder.
There are other possible ways to execute the test scripts. For more information, refer to this.
Now let us build the Docker image:
docker build -t <docker_image_name>:<tag> <dockerfile_path>
Or navigate to the Dockerfile path and execute the command below:
docker build -t <docker_image_name>:<tag> .
Note: if you have any issues while building the image, please make sure the proxies are set in the Dockerfile.
Once the image is built successfully, running a container from it starts your test executions automatically, since the entry point is configured to kick them off.
I hope this article gave you some good insight into how you can quickly set up end-to-end tests for a complex system. Multiple components in multiple repositories should not be a barrier; Docker Compose makes it easy to put things together.
End-to-end tests are the best way to avoid crunch time. In complex systems, late delivery of some components puts a burden on other teams: integrations are done in a rush and code quality drops. That’s a vicious circle.
Selenium tests can be done quick and dirty to get going fast. That is perfectly OK. Automate things, then improve. Remember:
Done is better than perfect any day of the year.
That’s all folks! Thank you so much for reading, and please feel free to follow up with any questions. See you next time!!!
Posting more on Design topics soon.
Keep Learning, keep coding.