
Friday, March 19, 2021

How To Begin Your Presentation with Simon Sinek

  • Start with a story - let the audience know why and what you are presenting.
  • Present with the desire to give, not to sell - engage your audience by sharing your ideas without making it transactional.
  • Leave the credentials, facts and figures for the latter part of the presentation - this information goes to the neocortex, which has little to no influence in motivating your audience to stay fully engaged with your presentation.
 

Monday, February 15, 2021

Test Automation for REST & GraphQL APIs

Backend test automation, the middle layer of the Testing Pyramid, is an area that no efficient test automation strategy should overlook. It's quick, it's stable and luckily with the use of specific Java libraries, it's also easy to implement.

This article will present a Backend test automation framework built with JUnit 5, REST Assured and a specific library for GraphQL file manipulation.

Project structure and setup

As it is a Maven project, all the needed libraries can be easily imported through the POM file.
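For reference, the JUnit 5 and REST Assured dependencies would typically be declared as below. This is an illustrative sketch, not copied from the project: the version properties are placeholders, and the GraphQL helper library is left out since its Maven coordinates are not given in the text.

```xml
<dependencies>
    <!-- JUnit 5 test engine and API -->
    <dependency>
        <groupId>org.junit.jupiter</groupId>
        <artifactId>junit-jupiter</artifactId>
        <version>${junit.version}</version>
        <scope>test</scope>
    </dependency>
    <!-- REST Assured for HTTP request building and response validation -->
    <dependency>
        <groupId>io.rest-assured</groupId>
        <artifactId>rest-assured</artifactId>
        <version>${rest-assured.version}</version>
        <scope>test</scope>
    </dependency>
</dependencies>
```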

Two helper classes are also needed:

  • one for preparing the GraphQL payload, which can be checked here and is shown below:

public static String prepareGraphqlPayload(Map<String, String> variables, String queryFileLocation) {
    File file = new File(queryFileLocation);
    ObjectNode objectNode = new ObjectMapper().createObjectNode();
    for (Map.Entry<String, String> entry : variables.entrySet()) {
        objectNode.put(entry.getKey(), entry.getValue());
    }
    String graphqlPayload = null;
    try {
        graphqlPayload = GraphqlTemplate.parseGraphql(file, objectNode);
    } catch (IOException e) {
        e.printStackTrace();
    }
    return graphqlPayload;
}
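To make the helper's output concrete: a GraphQL HTTP payload is just a JSON object with a query field (the contents of the .graphql file) and a variables field. Below is a dependency-free sketch of the same idea - the class name and the minimal escaping are ours, and only string-valued variables are handled, just like in the helper above.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class GraphqlPayloadSketch {

    // Assembles {"query": "...", "variables": {...}} by hand.
    public static String buildPayload(String query, Map<String, String> variables) {
        StringBuilder vars = new StringBuilder("{");
        boolean first = true;
        for (Map.Entry<String, String> entry : variables.entrySet()) {
            if (!first) {
                vars.append(",");
            }
            vars.append("\"").append(escape(entry.getKey())).append("\":\"")
                .append(escape(entry.getValue())).append("\"");
            first = false;
        }
        vars.append("}");
        return "{\"query\":\"" + escape(query) + "\",\"variables\":" + vars + "}";
    }

    // Minimal JSON string escaping: backslashes first, then quotes and newlines.
    private static String escape(String s) {
        return s.replace("\\", "\\\\").replace("\"", "\\\"").replace("\n", "\\n");
    }

    public static void main(String[] args) {
        Map<String, String> variables = new LinkedHashMap<>();
        variables.put("number", "1");
        System.out.println(buildPayload("query { pokemon { name } }", variables));
        // prints {"query":"query { pokemon { name } }","variables":{"number":"1"}}
    }
}
```

This is only to illustrate the payload shape; in the real framework, Jackson's ObjectNode and the GraphQL template library handle the serialisation safely.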

  • one for preparing the JSON object, which can be checked here and is shown below:

public static JSONObject buildJsonObject(String location) {
    JSONParser parser = new JSONParser();
    JSONObject jsonObject = null;
    try {
        jsonObject = (JSONObject) parser.parse(new FileReader(location));
    } catch (IOException | ParseException e) {
        e.printStackTrace();
    }
    return jsonObject;
}

Specifying the APIs and test data

The APIs under test will be:

  • a GraphQL API - the Pokemon GraphQL service (https://pokeapi-graphiql.herokuapp.com);
  • a REST API - the Swagger Petstore service (https://petstore.swagger.io/v2).

As test data, we need:

  • for the GraphQL API:
  1. the Pokemon GraphQL query file
  2. the list of Pokemon and their attributes as a .csv file
  • for the REST API:
  1. the pet JSON payload files (pet_created.json and pet_updated.json)
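The post does not show the files' contents; as an illustration, the query file could look roughly like the following (the field names are assumptions about the schema, not copied from the project), with pokemons.csv holding a header row plus one line per Pokemon, matching the test parameters id,pokemonName,weight,height:

```graphql
# pokemon.graphql - $number is supplied from the variables map at runtime
query pokemon($number: Int!) {
  pokemon(number: $number) {
    name
    weight
    height
  }
}
```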


Writing the test scenarios

  • The GraphQL test class can be checked here and the main test is detailed below:

@ParameterizedTest
@CsvFileSource(resources = "/graphql/pokemons.csv", numLinesToSkip = 1)
void testGraphQL(String id, String pokemonName, String weight, String height) {
    RestAssured.baseURI = "https://pokeapi-graphiql.herokuapp.com";
    Map<String, String> variables = new HashMap<>();
    variables.put("number", id);
    String graphqlPayload = GraphqlUtil.prepareGraphqlPayload(variables, "src/test/resources/graphql/pokemon.graphql");
    given().log().body()
            .contentType(ContentType.JSON)
            .body(graphqlPayload)
            .post("/graphql")
            .then()
            .statusCode(200)
            .body("data.pokemon.name", equalTo(pokemonName))
            .body("data.pokemon.weight", equalTo(weight))
            .body("data.pokemon.height", equalTo(height));
    logger.info("GraphQL request successful for " + pokemonName);
}

Even though this scenario targets a GraphQL API, the REST Assured library can still be used, and we can take advantage of its straightforward features for the initial setup, executing the request and verifying the returned data.

  • The REST test class can be checked here and the initialisation step and first test case are detailed below:

    @BeforeAll
    static void initialize() {
        RestAssured.baseURI = "https://petstore.swagger.io/v2";
        petCreatedJson = buildJsonObject("src/test/resources/rest/pet_created.json");
        petUpdatedJson = buildJsonObject("src/test/resources/rest/pet_updated.json");
    }

    @Test
    @Order(1)
    void testPetCreation() {
        given().log().uri()
                .contentType(ContentType.JSON)
                .body(petCreatedJson)
                .post("/pet")
                .then()
                .statusCode(200).extract().response()
                .then().body("name", equalTo(petCreatedJson.get("name")));
        logger.info("Pet entry with id " + petCreatedJson.get("id") + " and name " + petCreatedJson.get("name") + " successfully CREATED");
    }

The @BeforeAll step handles the initial setup, setting the baseURI and building the two JSON objects that will be used in testing.

Because this test class was designed to verify the entire flow, with @Order guaranteeing the execution order of the test scenarios, the flow will be the following:

  1. Create the pet entry
  2. Verify the entry with correct info was created
  3. Update the pet entry
  4. Verify the entry was updated with the correct info
  5. Delete the pet entry
  6. Verify the entry was deleted
Thus, this test suite can run independently each time: it does not rely on previously available data, nor does it leave leftover entries in the database after each execution.

Sunday, February 7, 2021

Fast-Forward Frontend Test Automation with Selenide

Tired of having to write a lot of code (the Gherkin feature file, the Runner class, the Steps and the Page Object classes) just for a single Frontend test?

Also, you don't need the detailed and verbose reports offered by automation libraries like Cucumber or Serenity?

Then maybe Selenide is the solution you are looking for.

Selenide is a Selenium WebDriver wrapper: it builds on this time-tested library and adds new, powerful methods that can be checked here.

This article will present a sample frontend test automation framework built with Selenide and JUnit 5. It can be checked here; let's go through its structure step by step.

1. Setting up Selenide and JUnit 5

As it is a Maven project, simply add the needed dependencies in the pom file:

<dependency>
    <groupId>com.codeborne</groupId>
    <artifactId>selenide</artifactId>
    <version>${selenide.version}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter</artifactId>
    <version>${junit.version}</version>
    <scope>test</scope>
</dependency>

2. Define the test data

The showcased test is for a demo contact form, so the test data can be easily centralised in a .csv file, taking advantage of the very useful @ParameterizedTest feature from JUnit 5.
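For illustration, such a .csv could look like the following - all values are made up, and the message and uploadFile columns hold file paths, matching how the test consumes them:

```csv
subjectHeading,emailAddress,orderReference,message,uploadFile
Customer service,jane.doe@example.com,000123,src/test/resources/message.txt,src/test/resources/attachment.txt
Webmaster,john.doe@example.com,000456,src/test/resources/message.txt,src/test/resources/attachment.txt
```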

3. Implement the test

Having JUnit in place, we can use its annotations to best structure the test for the above contact form:

  • We will open the webpage in the @BeforeEach step:

@BeforeEach
public void loadWebpage() {
    open("http://automationpractice.com/index.php?controller=contact");
}
  • Implement the actual test steps in a @ParameterizedTest, using the above .csv file as test data:
@ParameterizedTest
@CsvFileSource(resources = "/contactForm.csv", numLinesToSkip = 1)
public void testContactForm(String subjectHeading, String emailAddress, String orderReference, String message, String uploadFile) {
    $(By.id("id_contact")).selectOption(subjectHeading);
    $(By.id("email")).setValue(emailAddress);
    $(By.id("id_order")).setValue(orderReference);
    $(By.id("message")).setValue(readTextFileAsString(message));
    $(By.id("fileUpload")).uploadFile(new File(uploadFile));
    $(By.id("submitMessage")).click();
    $(By.className("alert-success")).shouldHave(Condition.text("Your message has been successfully sent to our team"));
}

A special mention here for Selenide's uploadFile method, which can save you from the dreadful workarounds of trying to interact with the native OS file selection window.

  • Do the clean-up in the @AfterEach step:

@AfterEach
public void closeBrowser() {
    closeWebDriver();
}

So, as seen, the main test case for this webpage can be written concisely, in under 30 lines of code, whereas the Gherkin approach would have required considerably more code and effort to structure it.

Thursday, January 28, 2021

The "Gut Feeling"

When information is incomplete and maybe even contradictory, but taking a decision is of the utmost urgency, you can ultimately rely on your gut feeling. Trust it, you might even save the world!

Monday, January 18, 2021

Parallel Automated Testing with Selenium Docker Hub

Test automation - all well and good, but what happens when the product you are testing is UI-heavy and your test suite grows to hundreds of tests that take hours to fully execute? Each extra minute of runtime takes you one inch further away from the Holy Grail, namely CI/CD.

The answer is test parallelisation. Executing the tests in parallel can drastically decrease runtime, allowing you to get the test results faster and release to Production quicker.

Opening multiple browser instances simultaneously may not work perfectly every time, but luckily we have a bulletproof solution in Docker's Selenium Hub.

This article will present a Frontend test automation framework built with Serenity and Java and the steps for configuring its parallel test execution in a Selenium Docker Hub.

1. Setup the Docker Hub

Depending on the machine's OS, Docker is installed in different ways. In this case, the operating system was MacOS so Docker Desktop was initially installed.

Now, using a Docker Compose .yml file that contains all the setup info, you can start the grid as per your requirements (the required Docker images will be downloaded automatically). In the terminal, go to the folder where the .yml file is located and run the following command, specifying the number of Firefox and Chrome containers (2 in this case):

$ docker-compose up -d --scale firefoxnode=2 --scale chromenode=2
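The compose file itself is not shown in the post; a minimal sketch, assuming the Selenium 3 hub and debug node images and the service names used in the --scale flags above, might look like this:

```yaml
version: "3"
services:
  hub:
    image: selenium/hub
    ports:
      - "4445:4444"    # grid console reachable at localhost:4445
  chromenode:
    image: selenium/node-chrome-debug    # debug images expose VNC for live viewing
    depends_on:
      - hub
    environment:
      - HUB_HOST=hub
  firefoxnode:
    image: selenium/node-firefox-debug
    depends_on:
      - hub
    environment:
      - HUB_HOST=hub
```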

The debug versions of the Firefox and Chrome images allow you to connect to them with VNC Viewer and watch the test execution in real time.

Access http://localhost:4445/grid/console to see the up-and-running grid.

2. Configure the automation framework for parallel test execution in the newly created grid

Update the serenity.properties file with the type and new location for the webdriver:

webdriver.remote.url = http://localhost:4445/wd/hub
webdriver.remote.driver = chrome
webdriver.remote.os = LINUX

In the pom.xml file, update the configuration of the Maven Failsafe plugin (the plugin responsible for test execution in this case) so that the tests run in parallel on the new grid setup:

<configuration>
    <includes>
        <include>**/*TestSuite.java</include>
    </includes>
    <reuseForks>true</reuseForks>
    <argLine>-Xmx512m</argLine>
    <parallel>classes</parallel>
    <threadCount>2</threadCount>
    <forkCount>2</forkCount>
</configuration>

3. Run the tests

In the terminal, call the Failsafe plugin, which will execute all the TestSuite classes:

$ mvn clean verify

During the run, the nodes currently in use appear faded out in the grid console.
Check the summary results in the console and the detailed HTML report.
Sunday, January 10, 2021

Upgrading a Test Automation Framework with Security Testing Capabilities

This article will cover the steps for adding security testing capabilities to an automated testing framework (designed for Backend & Frontend testing) built with JUnit 5, Selenide and REST Assured.

The OWASP Zed Attack Proxy (ZAP) is a widely used, free and open-source web app scanner. It comes in a wide array of setups, and for this tutorial we will use the Docker version, running it in headless mode in a Docker container.

Depending on the machine's OS, Docker is installed in different ways. In this case, the operating system was MacOS so Docker Desktop was initially installed.

Setting up ZAP Docker

1. Get the ZAP Docker image by running in terminal:

docker pull owasp/zap2docker-stable

2. Run the image:

docker run -u root -p 8090:8090 -i owasp/zap2docker-stable zap-x.sh -daemon -host 0.0.0.0 -port 8090 -config api.addrs.addr.name=.\* -config api.addrs.addr.regex=true -config api.disablekey=true -config scanner.attackOnStart=true -config view.mode=attack

3. Check the newly started ZAP scanner at http://localhost:8090/.

Refactoring the test automation framework so that the ZAP proxy will intercept the traffic

The test automation framework showcased in this article can be checked here. It has 3 main test classes:

- UiTest.java, built with Selenide for Frontend testing;

- RestAPITest.java, built with REST Assured for Backend testing (REST service);

- GraphQLTest.java, built with REST Assured and an additional GraphQL helper library for Backend testing (GraphQL service).

1. For the Backend tests, which are built with REST Assured, the refactoring only means setting the proxy variable to the ZAP location with one simple line of code:

RestAssured.proxy("0.0.0.0", 8090, "http");

2. For the Frontend test, the refactoring is similar: add the ZAP proxy when instantiating the WebDriver:

ChromeOptions options = new ChromeOptions();
Proxy proxy = new Proxy();
proxy.setHttpProxy("0.0.0.0:8090");
options.setCapability("proxy", proxy);
ChromeDriver chromeDriver = new ChromeDriver(options);

3. Run the tests; ZAP will now intercept the traffic during each test execution and identify security vulnerabilities.

Checking the results

Go to http://localhost:8090/OTHER/core/other/htmlreport and check the discovered vulnerabilities. They also come with a short description and additional resources for information.

As seen in this article, with minimal effort you can give your test automation basic security testing capabilities, so security vulnerabilities can be discovered early in the development process, when they are much faster and cheaper to fix.

Building secure products is a shared responsibility.

Monday, January 4, 2021

Marginal Gains in Practice

What do 5 minutes mean?
Nothing.
What do 5 minutes per day for an entire year mean?
A lot!



Sunday, January 3, 2021

Automated Test for Measuring a Website's Performance using Google PageSpeed Insights

PageSpeed Insights and Lighthouse are two free Google tools that we can use to measure a website's performance. Although they share a common purpose, they differ in that Lighthouse uses lab data only and measures more than performance, while PageSpeed Insights focuses on performance metrics by analysing both lab and real-world data (the lab data actually being provided by Lighthouse).

This article will present an automated test, built with Maven and two Java libraries (JUnit and REST Assured), that audits the performance of websites using the features of the PageSpeed Insights tool.

First of all, PageSpeed Insights can also be used manually: simply go to its homepage and check the metrics for your desired website.

The scores also come with a detailed explanation; in summary, the metrics scores and the overall performance score are colored according to these ranges:
  • 0 to 49 (red): Poor
  • 50 to 89 (orange): Needs Improvement
  • 90 to 100 (green): Good
To provide a good user experience, sites should strive for a good score (90-100). A "perfect" score of 100 is extremely challenging to achieve and not expected. For example, taking a score from 99 to 100 needs about the same amount of metric improvement as taking one from 90 to 94.
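If you want tooling of your own to classify scores the same way, the ranges above translate to a trivial helper; a minimal sketch (the class and method names are ours, not part of any Google API):

```java
public class ScoreCategory {

    // Maps a 0-100 score to its color-coded category, following the
    // ranges listed above.
    public static String categorize(int score) {
        if (score < 0 || score > 100) {
            throw new IllegalArgumentException("score must be between 0 and 100");
        }
        if (score >= 90) {
            return "Good";
        }
        if (score >= 50) {
            return "Needs Improvement";
        }
        return "Poor";
    }

    public static void main(String[] args) {
        System.out.println(categorize(85)); // prints Needs Improvement
    }
}
```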


For the automated test, we will use the PageSpeed Insights API, which returns the audit results as a JSON object. To check it, simply run:

curl https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=${desiredWebsite}

The general performance score is displayed in the categories section of the returned JSON:

"categories": {
    "performance": {
        "id": "performance",
        "title": "Performance",
        "score": 0.85
    }
}
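If only this one number is needed, it can even be pulled out of the raw response with a regular expression; a stdlib-only sketch (the class name is ours, and a real test should prefer a proper JSON parser, as the REST Assured code in this article does):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PerformanceScoreExtractor {

    // Finds the first "score": <number> occurrence after the "performance" key.
    // Fragile by design - parsing the JSON properly is the robust approach.
    public static double extractPerformanceScore(String json) {
        Pattern pattern = Pattern.compile("\"performance\"[\\s\\S]*?\"score\"\\s*:\\s*([0-9.]+)");
        Matcher matcher = pattern.matcher(json);
        if (!matcher.find()) {
            throw new IllegalArgumentException("no performance score found");
        }
        return Double.parseDouble(matcher.group(1));
    }

    public static void main(String[] args) {
        String sample = "{\"categories\":{\"performance\":{\"id\":\"performance\","
                + "\"title\":\"Performance\",\"score\":0.85}}}";
        System.out.println(extractPerformanceScore(sample)); // prints 0.85
    }
}
```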

The implementation of the automated test can be found here, with:

  • JUnit executing the tests, performing the @BeforeAll and @AfterAll steps and loading the websites list from a .csv file;
  • REST Assured performing the GET request and parsing the returned JSON;
  • and the actual test method asserting that the performance score is at least 90 (0.9 on the API's 0-to-1 scale):

@ParameterizedTest
@CsvFileSource(resources = "/pagespeedonline/websites.csv")
void testWebsite(String website) {
    RestAssured.baseURI = "https://www.googleapis.com";
    Response response = given().log().uri()
            .when().get("/pagespeedonline/v5/runPagespeed?url=" + website)
            .then().extract().response();
    assertThat(website, response.getStatusCode(), equalTo(200));
    Double performanceScore = Double.valueOf(response.path("lighthouseResult.categories.performance.score").toString());
    websiteScores.put(website, performanceScore);
    assertThat(performanceScore, greaterThanOrEqualTo(0.9));
}

Being a Java project built with Maven, this test can easily be added to a Jenkins job and executed periodically as a performance health check.