Monday, March 29, 2021

What is Docker - Important to know about Docker for SDET


                             


Docker is a set of platform as a service (PaaS) products that use OS-level virtualization to deliver software in packages called containers. 
Containers are isolated from one another and bundle their own software, libraries and configuration files; they can communicate with each other through well-defined channels. Because all of the containers share the services of a single operating system kernel, they use fewer resources than virtual machines.


What is Docker?
Docker creates simple tooling and a universal packaging approach that bundles up all application dependencies inside a container which is then run on Docker Engine.

Docker Engine enables containerized applications to run anywhere consistently on any infrastructure, solving “dependency hell” for developers and operations teams, and eliminating the “it works on my laptop!” problem.


Package Software into Standardized Units for Development, Shipment and Deployment


What is Container?
A container is a standard unit of software that packages up code and all its dependencies so the application runs quickly and reliably from one computing environment to another. A Docker container image is a lightweight, standalone, executable package of software that includes everything needed to run an application: code, runtime, system tools, system libraries and settings.

Container images become containers at runtime; in the case of Docker, images become containers when they run on Docker Engine. Available for both Linux and Windows-based applications, containerized software will always run the same, regardless of the infrastructure. Containers isolate software from its environment and ensure that it works uniformly despite differences between, for instance, development and staging environments.

Docker containers that run on Docker Engine:
Standard: Docker created the industry standard for containers, so they could be portable anywhere
Lightweight: Containers share the machine’s OS kernel and therefore do not require an OS per application, driving higher server efficiency and reducing server and licensing costs
Secure: Applications are safer in containers and Docker provides the strongest default isolation capabilities in the industry


Comparing Containers and Virtual Machines
Containers and virtual machines have similar resource isolation and allocation benefits, but function differently because containers virtualize the operating system instead of hardware. Containers are more portable and efficient.



CONTAINERS:
Containers are an abstraction at the app layer that packages code and dependencies together. Multiple containers can run on the same machine and share the OS kernel with other containers, each running as isolated processes in user space. Containers take up less space than VMs (container images are typically tens of MBs in size), can handle more applications and require fewer VMs and Operating systems.


VIRTUAL MACHINES:
Virtual machines (VMs) are an abstraction of physical hardware turning one server into many servers. The hypervisor allows multiple VMs to run on a single machine. Each VM includes a full copy of an operating system, the application, necessary binaries and libraries - taking up tens of GBs. VMs can also be slow to boot.



















Sunday, March 28, 2021

Create Jenkins job using groovy script (Jenkins file)



Inside your project, create a new file named Jenkinsfile with the content below:




pipeline {
    agent any
    stages {
        stage ('Compile Stage') {

            steps {
                withMaven(maven : 'apache-maven-3.6.3') {
                    bat 'mvn clean compile'
                }
            }
        }
        stage ('Testing Stage') {

            steps {
                withMaven(maven : 'apache-maven-3.6.3') {
                    bat 'mvn test'
                }
            }
        }
        stage ('Install Stage') {
            steps {
                withMaven(maven : 'apache-maven-3.6.3') {
                    bat 'mvn install'
                }
            }
        }
    }
}
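
Note: The bat steps above assume the Jenkins agent runs on Windows. If your agent runs on Linux or macOS, the same stage would use the sh step instead, as in this minimal sketch:

        stage ('Compile Stage') {
            steps {
                withMaven(maven : 'apache-maven-3.6.3') {
                    // sh runs the command in a Unix shell instead of the Windows batch interpreter
                    sh 'mvn clean compile'
                }
            }
        }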

Push the code changes to Git.

Make sure the Maven name given in Jenkins Global Tool Configuration matches the Maven name used in the Jenkinsfile.




As you can see in the screenshot below, we have used the same Maven name in the Jenkinsfile as well.



Now create a new Jenkins Pipeline job. Ensure that the "Pipeline" and "Pipeline Maven Integration" plugins are installed.

The job will look as shown below. Save the job and build it; Jenkins will search for the Jenkinsfile and run all the stages defined inside it.

Once the execution is over, Jenkins shows the status of every stage, which makes debugging much easier in case of failures. This is all about the pipeline.




Click on save.

And execute the job.




How to Integrate Jenkins with GitHub?

 


How do I trigger a build automatically in Jenkins?


1. Click on New item

2. Enter a project name

3. Click on OK

4. Go to ‘Source Code Management’





5. Go to ‘Build Trigger’

    a. Select ‘GitHub hook trigger for GITScm polling’,
        because the job will listen for GitHub webhooks




6. Now go to your GitHub repository

    a. Click on Settings




    b. Go to Webhooks
    c. Click on ‘Add webhook’






    d. Now you need to enter the Payload URL, which should be your Jenkins URL followed by ‘/github-webhook/’.
        So it will look like this: ‘http://192.168.0.103:8080/github-webhook/’

    e. Select ‘Content Type’ as ‘application/json’

    f. Click on Add button




Webhook added successfully




Now whenever you push changes to your Git repository, Jenkins will trigger the build and start execution.

Note: If the Jenkins build is not triggering, it means the connection between the GitHub webhook and Jenkins has not been established.

If you want to run Jenkins on localhost, the workaround is as follows:
1. Install ngrok (https://ngrok.com/download), which exposes localhost URLs over the internet.
2. After installing ngrok, run it, e.g. ./ngrok http 8080
It will give you a URL like this: http://3b2db437.ngrok.io

Now under Payload URL, type the URL as: http://3b2db437.ngrok.io/github-webhook/

The localhost Jenkins setup will now receive the webhook and the payload error will be gone.

Note: Do not append :8080 to the ngrok URL. The generated ngrok URL already maps to port 8080, and adding the port again would result in the service timeout error: "We couldn’t deliver this payload: Service Timeout".

Friday, March 26, 2021

Jenkins Most important Interview question and answer


1. What is CI?



Continuous Integration is a software development practice where members of a team integrate their work frequently, usually each person integrates at least daily - leading to multiple integrations per day. Each integration is verified by an automated build (including test) to detect integration errors as quickly as possible.
Jenkins was earlier referred to as Hudson


Difference made by Jenkins

Pre Jenkins:
The source code was completely built and then tested.
Bugs identified in the source code during testing had to be fixed, and then the code re-tested.
This slows down software delivery, as the entire process is manual.

Post Jenkins:
Once a code change is committed, Jenkins automatically takes care of the build, test and reporting of results.


1) What is Jenkins?
Answer: Jenkins is a free open source Continuous Integration tool and automation server to monitor continuous integration and delivery. It is written in Java.

It is known as an automated Continuous Delivery tool that helps build and test the software system, with easy integration of changes into the system. Jenkins pipelines are scripted in Groovy.

It also enables developers to continuously check in their code and to analyze post-build actions. Automation testers can use it to run their tests as soon as new code is added or existing code is modified.


2) What are the features of Jenkins?
Answer: Jenkins comes with the following features:
Free open source.
Easy installation on various operating systems.
Build Pipeline Support.
Workflow Plugin.
Test harness built around JUnit.
Easy upgrades.
Rapid release cycle.
Easy configuration setup.
Extensible with the use of third-party plugins.


3) What are the advantages of Jenkins? Why we use Jenkins?
Answer: Jenkins is used to continuously monitor the large code base in real-time. It enables developers to find bugs in their code and fix them. Email notifications are made to the developers regarding their check-ins as a post-build action.

Advantages of Jenkins are as follows:
Build failures are caught during the integration stage.
Notifies the developers about the build report status using an LDAP (Lightweight Directory Access Protocol) mail server.
Maven release project is automated with simple steps.
Easy bug tracking.
Automatic changes get updated in the build report with notification.
Supports Continuous Integration in agile development and test-driven development.


4) Mention some of the important plugins in Jenkins?
Answer: Plugins in Jenkins includes:
Git
Maven 2 Project
HTML Publisher
Copy Artifact
Join
Green Balls
Amazon EC2



5) What is Continuous Integration in Jenkins?
Answer: Continuous integration is the process of continuously checking-in the developer’s code into a version control system and triggering the build to check and identify bugs in the written code.

This is a very quick process and also gives them a chance to fix the bugs. Jenkins is one such continuous integration tool.

In software development, multiple developers work on different software modules. During integration testing, all the modules are integrated together. Frequently integrating the code into a shared source repository is considered a core development practice.

Whenever a programmer/developer makes a change to the current code, it automatically gets integrated with the system running on the tester’s machine, which makes the testing task easy and fast for the system testers.

Continuous Integration comprises:
Development and Compilation
Database Integration
Unit Testing
Production Deployment
Code Labeling
Functional Testing
Generating and Analyzing Reports


6) What is the difference between Hudson and Jenkins?
Answer: There is no difference between Hudson and Jenkins. Hudson was the former name of Jenkins; after a dispute with Oracle over the project, it was renamed to Jenkins.


7) What is Groovy in Jenkins?
Answer: Groovy is the language used to write Jenkins pipelines (Jenkinsfiles). It is also the default scripting language used in the development of JMeter version 3.1.

Apache Groovy is a dynamic, object-oriented programming language used as a scripting language for the Java platform. It comes with useful features such as Java compatibility and development support.


8) Which command is used to start Jenkins?
Answer: You can follow the below-mentioned steps to start Jenkins:
Open Command Prompt
From the Command Prompt, browse to the directory where jenkins.war resides
Run the command given below:
D:\> java -jar jenkins.war


9) What is Jenkinsfile?
Answer: The text file in which the pipeline definition is written is called a Jenkinsfile. It is checked into the source control repository.


10) What is the difference between Continuous Integration, Continuous Delivery, and Continuous Deployment?
Answer: The diagrammatic representation given below can elaborate on the differences between Continuous Integration, Continuous Delivery, and Continuous Deployment more precisely.

Continuous Integration:



(It involves keeping the latest copy of the source code in a commonly shared repository, from which all developers can fetch the latest changes in order to avoid conflicts.)

Continuous Delivery:



(Manual Deployment to Production. It does not involve every change to be deployed.)

Continuous Deployment:



(Automated Deployment to Production. Involves every change to be deployed automatically.)


11) What is Jenkins Pipeline? What is a CI CD pipeline?
Answer: The pipeline can be defined as the suite of plugins supporting the implementation and integration of continuous delivery pipelines in Jenkins.

A continuous integration or continuous delivery pipeline consists of build, deploy, test and release stages. The pipeline feature saves a lot of time and errors in maintaining builds. Basically, a pipeline is a group of build jobs that are chained and integrated in sequence.


12) What are Scripted Pipelines in Jenkins?
Answer: Scripted Pipeline follows Groovy syntax, as given below:
node {
}

In the above syntax, the node is part of the Jenkins distributed architecture, where there are two types of nodes: the master, which handles all the tasks in the development environment, and the agents, which are used to handle individual tasks.
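
For illustration, a minimal Scripted Pipeline sketch (the stage names are arbitrary, and the Maven tool name 'apache-maven-3.6.3' is reused from the Jenkinsfile example earlier in this post):

node {
    stage('Checkout') {
        // Works when the pipeline script itself is loaded from SCM
        checkout scm
    }
    stage('Build') {
        withMaven(maven: 'apache-maven-3.6.3') {
            bat 'mvn clean install'
        }
    }
}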


13) What are Declarative Pipelines in Jenkins?
Answer: Declarative Pipelines are a newer addition to Jenkins that simplify the Groovy syntax of Jenkins pipelines (the top-level pipeline), with a few rules, such as:

No semicolons are used as statement separators, and the top-level pipeline must be enclosed within a pipeline block, viz:

The common syntax is:
pipeline {
    /* Declarative Pipeline */
}


Blocks must contain Sections, Directives, steps or assignments.

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // Statements...
            }
        }
        stage('Test') {
            steps {
                // Statements...
            }
        }
    }
}


The above code has 3 major elements
Pipeline: The block of script contents.
Agent: Defines where the pipeline will start running from.
Stage: The pipelines contain several steps enclosed in the block called Stage.


14) What is SCM? Which SCM tools are supported in Jenkins?
Answer:
SCM stands for Source Control Management.
SCM module specifies the source code location.
The entry point to SCM is being specified as jenkins_jobs.scm.
The job specified with ‘scm’ attribute accepts multiple numbers of SCM definitions.

The SCM can be defined as:

scm:
  name: eloc-scm
  scm:
    git:
      url: ssh://Jenkins.org/eloc.git


Jenkins supported SCM tools include:
CVS
Git
Perforce
AccuRev
Subversion
Clearcase
RTC
Mercurial


15) Which CI tools are used in Jenkins?
Answer: The well-known CI tools include:
Jenkins
GitLab CI
Travis CI
CircleCI
Codeship
Go CD
TeamCity
Bamboo


16) Which commands can be used to start Jenkins manually?
Answer: You can use the following commands to start Jenkins manually:
1. (Jenkins_url)/restart: Forces a restart without waiting for builds to complete.
2. (Jenkins_url)/safeRestart: Waits until all running builds are completed before restarting.


17) Which Environmental Directives are used in Jenkins?
Answer: The environment directive specifies a sequence of key-value pairs, called environment variables, for the steps in the pipeline.
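
A short environment directive sketch (the variable names and values below are illustrative, not from the original post):

pipeline {
    agent any
    environment {
        // Key-value pairs available to every step in the pipeline
        APP_ENV  = 'qa'
        BASE_URL = 'http://localhost:8080'
    }
    stages {
        stage('Print') {
            steps {
                // Environment variables are referenced through the env object
                echo "Running against ${env.BASE_URL} in ${env.APP_ENV}"
            }
        }
    }
}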


18) What are Triggers?

Answer: A trigger in Jenkins defines the way in which, and how frequently, the pipeline should be executed automatically. pollSCM, cron, etc. are the currently available triggers.
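
For example, a triggers directive sketch using cron and pollSCM (the schedules below are arbitrary examples):

pipeline {
    agent any
    triggers {
        // Poll the SCM every 15 minutes for new commits
        pollSCM('H/15 * * * *')
        // Also run the pipeline every night around 2 AM
        cron('H 2 * * *')
    }
    stages {
        stage('Build') {
            steps {
                echo 'Triggered build'
            }
        }
    }
}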


19) What is Agent Directive in Jenkins?
Answer: The agent section specifies where the entire pipeline, or a specific stage, will execute. It is specified at the top level inside the pipeline block.


20) How to make sure that your project build does not break in Jenkins?
Answer: You need to follow the below-mentioned steps to make sure that the project build does not break:
1. Perform a clean, successful build (with all unit tests) on your local machine before committing.
2. Make sure all code changes are committed and reflected successfully.
3. Check repository synchronization to make sure that all the differences and changes related to config and other settings are saved in the repository.


22) How will you define Post in Jenkins?
Answer: post is a section that contains additional steps that run after the completion of the pipeline (or of a stage). The execution of the steps within each condition block depends upon the completion status of the pipeline.

The condition blocks include: changed, success, always, failure, unstable and aborted.
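
A minimal post section sketch showing a few of these condition blocks:

pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                echo 'Running tests...'
            }
        }
    }
    post {
        always {
            echo 'Runs regardless of the pipeline result'
        }
        success {
            echo 'Runs only when the pipeline succeeds'
        }
        failure {
            echo 'Runs only when the pipeline fails'
        }
    }
}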


23) What are Parameters in Jenkins?
Answer: Parameters allow a pipeline to accept user-supplied values and support various pipeline use cases. The parameters directive is defined at the top level of the pipeline block.
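
A small parameters sketch (the parameter names and defaults are illustrative):

pipeline {
    agent any
    parameters {
        string(name: 'BRANCH', defaultValue: 'master', description: 'Branch to build')
        booleanParam(name: 'RUN_TESTS', defaultValue: true, description: 'Run the test stage?')
    }
    stages {
        stage('Info') {
            steps {
                echo "Building branch ${params.BRANCH}, run tests: ${params.RUN_TESTS}"
            }
        }
    }
}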


24) How you can set up a Jenkins job?
Answer: Setting up a new job in Jenkins is elaborated below with snapshots:

Step 1: Go to the Jenkins Dashboard and log in with your registered login credentials.



Step 2: Click on the New Item that is shown in the left panel of the page.



Step 3: Click on the Freestyle Project from the given list on the upcoming page and specify
the item name in the text box.



Step 4: Add the URL to the Git Repository.



Step 5: Go to the Build section and click on the Add build step => Execute Windows batch
command.



Step 6: Enter the command in the command window as shown below.



Step 7: After saving all the settings and changes click on Build Now.



Step 8: To see the status of the build click on Console Output.




25) What are the two components (pre-requisites) that Jenkins is mainly integrated with?
Answer: Jenkins integrates with:
Build tools/ Build working script like Maven script.
Version control system/Accessible source code repository like Git repository.


26) How can You Clone a Git Repository via Jenkins?
Answer: To clone a Git repository via Jenkins, you need to provide your Git login credentials in the Jenkins system.

To achieve the same, you need to enter the Jenkins job directory and execute the git config command.
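
In a pipeline job, the same thing can be sketched with the git step (the repository URL and credentials ID below are placeholders, not real values):

pipeline {
    agent any
    stages {
        stage('Clone') {
            steps {
                // credentialsId refers to credentials stored in the Jenkins credentials store (hypothetical ID)
                git url: 'https://github.com/your-org/your-repo.git',
                    branch: 'master',
                    credentialsId: 'github-credentials'
            }
        }
    }
}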


27) How can you secure Jenkins?
Answer: Securing Jenkins is a little lengthy process, and there are two aspects of securing Jenkins:

(i) Access Control which includes authenticating users and giving them an appropriate set of permissions, which can be done in 2 ways.
Security Realm determines a user or a group of users with their passwords.
Authorization Strategy defines what should be accessible to which user. In this case, there might be different types of security based on the permissions granted to the user such as Quick and simple security with easy setup, Standard security setup, Apache front-end security, etc.

(ii) Protecting Jenkins users from outside threats.


28) How to create a backup and copy files in Jenkins?
Answer: In Jenkins, all the settings, build logs and configurations are stored in the JENKINS_HOME directory. Whenever you want to create a backup of your Jenkins you can back up JENKINS_HOME directory frequently.

It consists of all the job configurations and slave node configurations. Hence, regularly copying this directory allows us to keep a backup of Jenkins.

You can maintain a separate backup file and copy it whenever you need it. If you want to copy a Jenkins job, you can do so by simply replicating the job directory.


29) What is the use of Backup Plugin in Jenkins? How to use it?
Answer: Jenkins Backup Plugin is used to back up the critical configurations and settings in order to use them in the future in case of any failure or as per the need of time.

The following steps are followed to back up your settings by using the Backup Plugin.

Step 1: Go to the Jenkins Dashboard and click on Manage Jenkins.



Step 2: Click on Manage Plugins that appears on the next page.




Step 3: Go to Available Tab on the next page and search for ThinBackup.



Step 4: Once you choose the available option, it will start installing.

Step 5: Once it is installed the following screen will appear, from there choose Settings.



Step 6: Enter the necessary details like backup directory along with other options as shown on the below screen and save the settings. The backup will be saved to the specified Backup Directory.



Step 7: Go to the previous page to test whether the backup is happening or not by clicking on Backup Now as shown in the below image.



Step 8: Finally, you can check the Backup Directory specified in the ThinBackup settings (Step 6) to verify the whole backup.


30) What is Flow Control in Jenkins?
Answer: In Jenkins, flow control follows the pipeline structure (scripted pipeline), which is executed from top to bottom of the Jenkinsfile.


31) What is the solution if you find a broken build for your project?
Answer: To resolve the broken build follow the below-mentioned steps:
Open the console output for the build and check whether any file changes were missed.

OR
Clean and update your local workspace to replicate the problem on the local system and try to resolve it (In case you couldn’t find out the issue in the console output).


32) What are the basic requirements for installing Jenkins?
Answer: For installing Jenkins you need the following system configuration:
Java 7 or above.
Servlet 3.1
RAM ranging from 200 MB to 70+ GB depending on the project build needs.
2 GB or more of memory (recommended).


33) How can you define a Continuous Delivery Workflow?
Answer: The flowchart below shows the Continuous Delivery Workflow. Hope it will be much easier to understand with visuals.




34) What are the various ways in which the build can be scheduled in Jenkins?
Answer: The build can be triggered in the following ways:
After the completion of other builds.
By source code management (modifications) commit.
At a specific time.
By requesting manual builds.


35) Why is Jenkins called a Continuous Delivery Tool?
Answer: We have seen the Continuous Delivery workflow in the previous question; now let’s see, step by step, why Jenkins is called a Continuous Delivery tool:
Developers work in their local environment to make changes in the source code and push them to the code repository.
When a change is detected, Jenkins runs several tests and code-standard checks to decide whether the changes are good to deploy or not.
Upon a successful build, the result is reviewed by the developers.
Then the change is deployed manually to a staging environment where the client can have a look at it.
When all the changes are approved by the developers, testers, and clients, the final outcome is deployed manually to the production server to be used by the end users of the product.

In this way, Jenkins follows a Continuous Delivery approach and is called the Continuous Delivery Tool.


36) Give any simple example of Jenkins script.

Answer: This is a Jenkins declarative pipeline code for Java:
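
A minimal declarative pipeline sketch for a Java/Maven project (reusing the Maven tool name 'apache-maven-3.6.3' from the Jenkinsfile example earlier in this post; adjust it to match your own Global Tool Configuration):

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                withMaven(maven: 'apache-maven-3.6.3') {
                    bat 'mvn clean compile'
                }
            }
        }
        stage('Test') {
            steps {
                withMaven(maven: 'apache-maven-3.6.3') {
                    bat 'mvn test'
                }
            }
        }
        stage('Package') {
            steps {
                withMaven(maven: 'apache-maven-3.6.3') {
                    bat 'mvn package'
                }
            }
        }
    }
}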




Wednesday, March 17, 2021

Top 20 SQL Queries Interview Questions and Answers for Software Testing professionals - SET 3

 





1.     List all the employee details

SQL > Select * from employee;


2.     List all the department details

SQL > Select distinct DEPARTMENT_ID from Employee;


3.     List all job details

SQL > Select distinct JOB_ID from Employee;


4.     List all the locations

SQL > Select distinct loc from Employee;


5.List the latest updated record

select TOP 1 * from employee ORDER BY HIREDATE DESC

OR

select * from Employee where HIREDATE=(select max(HIREDATE) from Employee)

 

6.List out first name, last name, salary, commission for all employees

select FIRST_NAME, LAST_NAME, SALARY, COMMISSION from employee

 

7. List out employee_id, last name, department id for all employees and rename employee id as “ID of the employee”, last name as “Name of the employee”, department id as “Department ID”

Select employee_id as "ID of the employee", last_name as "Name of the employee", department_id as "Department ID" from employee;

 

8. List out the employees’ annual salary with their names only.

select first_name as Employee_name, (salary*12) as Annual_Salary from employee;

 

9. List the details about “SMITH”

Select * from employee where first_name = 'SMITH';

 

10. List out the employees who are working in department 20

Select * from employee where department_id = 20;

 

11. List out the employees who are earning a salary between 3000 and 4500

Select * from employee where salary between 3000 and 4500

 

12. List out the employees who are working in department 10 or 20

select * from employee where DEPARTMENT_ID in (10,20)

 

13. Find out the employees who are not working in department 10 or 30

select * from employee where DEPARTMENT_ID NOT in (10,30)

 

14. List out the employees whose name start with “S” and end with “H”

Select * from employee where last_name like 'S%H'

 

15.List out the employees whose name length is 4 and start with “S”

Select * from employee where last_name like 'S___'

 

16. List out the employees who are working in department 20 or 30 and drawing a salary of more than 3500

select * from employee where DEPARTMENT_ID in (20,30) AND salary > 3500


17. List out the employees who are not receiving commission

select * from employee where COMM is NULL


Thursday, March 11, 2021

Most useful Docker commands for Automation tester

 













Docker commands and their meaning:

docker image pull selenium/standalone-chrome
    Pulls the standalone Chrome image.

docker image ls
    Lists the available/downloaded images.

docker container create selenium/standalone-chrome
    Creates a Chrome container; it returns the container ID.

docker container start containerID
    Starts the container.

docker container ps
    Lists the running containers, so you can check whether the container started or not.

docker stop containerID
    Stops the running container.

docker stop containerID1 containerID2 containerID3
    Stops multiple running containers at a time.

docker restart containerID
    Restarts the container.

docker images -f "reference=selenium/*:latest"
    Filters images whose name starts with ‘selenium’ ('*' is a wildcard) and that have the latest tag.

docker ps
    Shows the running containers.

docker ps -a
    Shows all containers, both running and stopped, so you can check a container’s status.

docker rm containerID
    Deletes the container.

docker rm containerID1 containerID2 containerID3
    Deletes multiple containers at a time.

docker rmi imageID
    Deletes the image with the given imageID.

docker inspect containerID
    Inspects the container (shows its detailed configuration).

docker kill containerID
    Terminates the running container.

docker container run
    Pulls, creates and starts the container with a single command.

docker run -p 4444:4444 selenium/standalone-chrome
    Maps local port 4444 to container port 4444.

docker run -p 4444:4444 --name selgrid selenium/standalone-chrome
    Maps local port 4444 to container port 4444 and names the container selgrid.

docker run -d -p 4444:4444 --name selgrid selenium/standalone-chrome
    Maps local port 4444 to container port 4444, names the container selgrid, and runs it in the background (detached mode).

docker container --help
    Shows the docker container commands.

docker exec -it containerID /bin/bash
    Takes you inside the container.
    -t indicates you want to see the output on the terminal.
    -i indicates you want to enter input at the command prompt inside the container.

 

 

How to run selenium test cases on Dockerised selenium grid (Traditional Approach)

Pre-requisites:

1. Download the selenium/hub image

2. Download the selenium/node-chrome-debug image

3. Download the selenium/node-firefox-debug image

docker run -d -p 4444:4444 --name selenium-hub selenium/hub:latest
or
docker run -d -p 4444:4444 --restart always --name selenium-hub selenium/hub:latest
    'docker run -d' runs this container in the background, '-p 4444:4444' maps local port 4444 to the Selenium hub port 4444, '--name selenium-hub' names the container, and the container is created from 'selenium/hub:latest', i.e. the hub image with the latest tag.

docker logs containerID
    To see the logs.

http://localhost:4444/grid/console
    Hit the URL in a browser. You should see that the hub has been created.

The hub is created; now our next task is to link a Chrome node to the Selenium hub.

docker run -d --link selenium-hub:hub selenium/node-chrome-debug:latest
or
docker run -d --link selenium-hub:hub --restart on-failure:3 selenium/node-chrome-debug:latest
    Creates the node and links it with the Selenium hub.
    'docker run -d' runs the command in the background.
    '--link selenium-hub:hub' links this node to the container named 'selenium-hub', which acts as the hub, hence the alias ':hub'.
    'selenium/node-chrome-debug:latest' is the Chrome node image with the latest tag.

http://localhost:4444/grid/console
    Refresh the browser and you should see the Chrome node registered, along with its version.

Similarly, now link a Firefox node to the Selenium hub:

docker run -d --link selenium-hub:hub selenium/node-firefox-debug:latest
or
docker run -d --link selenium-hub:hub --restart on-failure:3 selenium/node-firefox-debug:latest
    Creates the node and links it with the Selenium hub, exactly as above, but using the 'selenium/node-firefox-debug:latest' Firefox node image.

http://localhost:4444/grid/console
    Refresh the browser and you should see the Firefox node registered, along with its version.

Here we have created two nodes. If you want to attach one more Firefox or Chrome node, simply run:

docker run -d --link selenium-hub:hub selenium/node-firefox-debug:latest

http://localhost:4444/grid/console
    Refresh the browser and you should now see three nodes: one Chrome node and two Firefox nodes.
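
Once the grid is up, your tests simply point to the hub URL instead of a local browser driver. A minimal Groovy sketch (assuming Selenium client libraries, e.g. 3.x, are on the classpath to match the selenium/hub image used above):

import org.openqa.selenium.chrome.ChromeOptions
import org.openqa.selenium.remote.RemoteWebDriver

// The hub URL exposed by the dockerised grid started above
def hubUrl = new URL("http://localhost:4444/wd/hub")

// Ask the grid for a Chrome session; the hub routes the request to a registered Chrome node
def driver = new RemoteWebDriver(hubUrl, new ChromeOptions())
driver.get("https://www.google.com")
println "Page title: ${driver.title}"
driver.quit()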

 

 

Container Restart policy -

Suppose you are executing a test suite of 100 test cases and at the 5th test case your container crashes; in that case, all the remaining test cases will fail. In this scenario, a restart policy is very important and good practice.

no (default)
    This is the default flag; the container is not restarted automatically.

on-failure
    The Docker daemon restarts the container only when the container crashes.

always
    The container always restarts, unless the Docker daemon is stopped.

unless-stopped
    The container always restarts until it is manually stopped.

How to update a container with a restart policy:

docker update --restart on-failure:2 containerID
    The restart policy is applied to that particular container; here the container is updated with a restart policy of on-failure, 2 times.

docker inspect containerID
    You can check a container's restart policy by inspecting it.

 

 

How to run selenium test cases on Dockerised selenium grid - Docker Compose file

Pre-requisites:

1. Create a docker-compose.yaml file in Eclipse (refer to the file below).

2. Open the command prompt.

3. Go to the docker-compose file location.

4. Enter the commands below.

docker-compose up
or
docker-compose up -d
    Executes the compose file with the .yaml extension; the first form also shows the logs in the console, while '-d' runs everything in the background.

docker-compose -f selenium-compose.yaml up
    If your docker compose file name is different, you can use this command.

docker-compose down
    Stops and removes the containers created from your compose file.

docker-compose ps
    Shows all the services and containers defined in the docker compose file.

docker-compose scale chrome=5
    Creates 4 new Chrome instances, as 1 is already there in the compose file ('chrome' is the service name in the docker-compose file).

 

 

 

Refer to the docker-compose file below:

version: "3"
services:
  selenium-hub:
    image: selenium/hub:latest
    container_name: selenium-hub
    restart: always
    ports:
      - "4444:4444"
  chrome:
    image: selenium/node-chrome-debug:latest
    volumes:
      - /dev/shm:/dev/shm
    depends_on:
      - selenium-hub
    environment:
      - HUB_HOST=selenium-hub
      - HUB_PORT=4444
    restart: on-failure:3
  chrome_79:
    image: selenium/node-chrome-debug:3.141.59-zinc
    volumes:
      - /dev/shm:/dev/shm
    depends_on:
      - selenium-hub
    environment:
      - HUB_HOST=selenium-hub
      - HUB_PORT=4444
    restart: on-failure:3
  firefox:
    image: selenium/node-firefox-debug:latest
    volumes:
      - /dev/shm:/dev/shm
    depends_on:
      - selenium-hub
    environment:
      - HUB_HOST=selenium-hub
      - HUB_PORT=4444
    restart: on-failure:3

 

Real time Dashboard:

Important Link:

Getting started with elastic search

Download Elasticsearch

Install elastic search with docker

Environment variable configuration

Docker hub - elasticsearch

 

Docker commands:

docker run -p 9200:9200 -p 9300:9300 --name elasticsearch -e "discovery.type=single-node" docker.elastic.co/elasticsearch/elasticsearch:7.11.1
    To run Elasticsearch.

http://localhost:9200/
    To check that Elasticsearch is up and running.

docker run -p 5601:5601 --name kibana --link elasticsearch:elasticsearch docker.elastic.co/kibana/kibana:7.11.1
    To run Kibana.

http://localhost:5601/
    To check that Elasticsearch and Kibana are up and running.

 


Once Elasticsearch is up and running, you can push your data to Elasticsearch as shown below:

URI - http://localhost:9200/world/countries

Endpoint - http://localhost:9200/
Index - world
Type - countries
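
As a rough sketch, a document can be pushed to that URI with a plain HTTP POST (the JSON body below is only an illustrative example; on newer Elasticsearch versions you may need the http://localhost:9200/world/_doc endpoint instead of a custom type):

// Minimal Groovy HTTP POST that indexes one document into Elasticsearch
def conn = new URL("http://localhost:9200/world/countries").openConnection()
conn.requestMethod = 'POST'
conn.doOutput = true
conn.setRequestProperty('Content-Type', 'application/json')
conn.outputStream.withWriter { it << '{"name": "India", "continent": "Asia"}' }
println "Response code: ${conn.responseCode}"
println conn.inputStream.text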

To see the data in Elasticsearch, download a Chrome extension for Elasticsearch:
1. Search for "elasticsearch chrome extension" on Google.
2. Install the 'Elasticsearch Head' extension.
3. Launch the extension by clicking on the Elasticsearch Head icon.

You can see all your data in Elasticsearch using this extension.

 

 

*** Manage your Docker containers using Portainer

docker run -d -p 9000:9000 -v /var/run/docker.sock:/var/run/docker.sock portainer/portainer
    To run Portainer.

http://localhost:9000/
    To check that Portainer is up and running.

 

Setup Selenoid in docker

 

 

 

 

 

 
