How to Perform Unit Testing for AWS Glue Jobs in an Azure DevOps Pipeline

Unit testing AWS Glue jobs is challenging because the Glue environment is difficult to replicate locally. Fortunately, AWS publishes official Glue container images on Amazon ECR Public, and using them for local development and testing is covered in the official AWS documentation. These images allow us to run unit tests effectively. In this blog post, we will walk through running AWS Glue job unit tests within an Azure DevOps pipeline and discuss how to calculate and publish code coverage for these tests.

To begin with, the Glue container image runs as a dedicated non-root user, glue_user, as declared in the image's Dockerfile:

USER glue_user

Assuming you have developed your Glue job in a Python script named myawesomegluejob.py, stored in an Azure DevOps (AzDO) Git repository, creating a pipeline for it might initially seem straightforward. However, executing build steps directly within the Glue container is not feasible due to the permission constraints that come with running as glue_user.
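For illustration, here is a minimal sketch of how such a job and its test might be structured so that the transform logic is unit-testable. The function names, column names, and S3 paths below are assumptions for this sketch, not part of the original job.

# myawesomegluejob.py -- hypothetical minimal layout for illustration
from pyspark.sql import DataFrame

def filter_active_rows(df: DataFrame) -> DataFrame:
    # Pure transform logic: takes and returns a DataFrame, so a unit
    # test can exercise it without a running Glue job.
    return df.filter(df["status"] == "active")

def main() -> None:
    # Entry point used only when the script runs as a real Glue job;
    # awsglue imports stay inside main so importing the module for
    # tests does not require a Glue job context.
    from awsglue.context import GlueContext
    from pyspark.context import SparkContext

    glue_context = GlueContext(SparkContext.getOrCreate())
    spark = glue_context.spark_session
    df = spark.read.parquet("s3://my-bucket/input/")  # hypothetical path
    filter_active_rows(df).write.parquet("s3://my-bucket/output/")

if __name__ == "__main__":
    main()

A matching test, placed under the test directory that pytest is pointed at later, could look like this. It relies on the local Spark session that the Glue container provides:

# test/test_myawesomegluejob.py
from pyspark.sql import SparkSession
from myawesomegluejob import filter_active_rows

def test_filter_active_rows():
    spark = SparkSession.builder.master("local[1]").getOrCreate()
    df = spark.createDataFrame(
        [("a", "active"), ("b", "inactive")], ["id", "status"])
    assert [r.id for r in filter_active_rows(df).collect()] == ["a"]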

To overcome this limitation, our approach uses Docker commands in the pipeline to pull the Glue image and then mount the Azure DevOps pipeline's file structure inside the Glue container. This lets the container write test results and code coverage data back to the agent's file system, where later pipeline steps can pick them up.

By default, the Azure DevOps pipeline file system is not writable by glue_user. To address this, we grant write access to all users by executing chmod -R 0777 $(Build.SourcesDirectory) before starting the container.

Next, we can execute the following command:

docker run -v $(Build.SourcesDirectory):/home/glue_user/workspace -w /home/glue_user/workspace public.ecr.aws/glue/aws-glue-libs:glue_libs_4.0.0_image_01 -c "pip install pytest pytest-azurepipelines pytest-cov; python3 -m pytest test --doctest-modules --junitxml=junit/test-results.xml --cov=main --cov-report=xml"

This command mounts $(Build.SourcesDirectory) into the /home/glue_user/workspace folder within the container and sets that folder as the working directory. The command string then installs the necessary Python libraries and runs the unit tests (note that --cov=main measures coverage of a module named main; point it at your own module, such as myawesomegluejob). As a result, a coverage.xml file is generated at $(Build.SourcesDirectory). However, because this file is created within the container, its sources node records the container's absolute paths rather than the agent's. To rectify this, we perform a string replacement using the sed command.
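To make this concrete, here is roughly what the rewrite does to the sources node. The exact shape of coverage.xml varies with the pytest-cov version, and the agent path shown is only an example of what $(Build.SourcesDirectory) might expand to on a hosted Linux agent:

<!-- Before: the path as seen inside the container -->
<sources>
    <source>/home/glue_user/workspace</source>
</sources>

<!-- After sed: the path as seen by the Azure DevOps agent -->
<sources>
    <source>/home/vsts/work/1/s</source>
</sources>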

Here's the relevant snippet covering these steps in the Azure DevOps pipeline:

- job: 'Scan_and_Build'
  steps:
    - script: |
        docker pull public.ecr.aws/glue/aws-glue-libs:glue_libs_4.0.0_image_01
      displayName: 'Pull Glue Image'

    - script: |
        chmod -R 0777 $(Build.SourcesDirectory)
        docker run -v $(Build.SourcesDirectory):/home/glue_user/workspace -w /home/glue_user/workspace public.ecr.aws/glue/aws-glue-libs:glue_libs_4.0.0_image_01 -c "pip install pytest pytest-azurepipelines pytest-cov; python3 -m pytest test --doctest-modules --junitxml=junit/test-results.xml --cov=main --cov-report=xml"
        sed -i "s|/home/glue_user/workspace|$(Build.SourcesDirectory)|g" $(Build.SourcesDirectory)/coverage.xml
      displayName: 'Run tests'

    - task: PublishTestResults@2
      condition: succeeded()
      inputs:
        testResultsFiles: '**/test-*.xml'
      displayName: 'Publish unit test results'
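
The pipeline above publishes the test results but not the coverage report itself. One way to surface coverage.xml in the pipeline UI is the standard PublishCodeCoverageResults task; the step below is a sketch that was not in the original snippet, and it assumes coverage.xml lands at the repository root as described above. (pytest-azurepipelines can also publish coverage automatically, so treat this as the explicit alternative.)

    # Sketch: explicitly publish the Cobertura-format report from pytest-cov
    - task: PublishCodeCoverageResults@1
      condition: succeededOrFailed()
      inputs:
        codeCoverageTool: 'Cobertura'
        summaryFileLocation: '$(Build.SourcesDirectory)/coverage.xml'
      displayName: 'Publish code coverage'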

Conclusion

In this blog post, we demonstrated how to run unit tests for AWS Glue jobs within an Azure DevOps pipeline. By leveraging the official Glue container images and a few Docker commands, you can execute unit tests and publish test results and code coverage data, helping ensure the reliability and stability of your Glue jobs.
