
Integrating React with SonarQube using Azure DevOps Pipelines

In the world of automation, code quality is of paramount importance. SonarQube and Azure DevOps are two tools that solve this problem in a continuous, automated way, and they work well together for a majority of languages and frameworks. However, making the integration work for React applications remains a challenge. In this post we will explore how to integrate a React application with SonarQube using Azure DevOps pipelines to continuously build and assess code quality.


Let's start at the beginning. We will use npx to create a TypeScript-based React app. Why TypeScript? I find it easier to work with and more maintainable owing to its strongly-typed behavior. You can follow this guide for JSX-based applications too. We will use the fantastic Create React App (CRA) tool to create a React application called 'sonar-azuredevops-app'.

> npx create-react-app sonar-azuredevops-app --template typescript

Once the project creation is done, we will use Visual Studio Code (my preferred editor) to open the app.

> cd sonar-azuredevops-app
> code .

At this point our basic out-of-the-box React application is ready. However, it does not do much yet.


Apps created using CRA come with Jest pre-installed. Jest is a testing framework for JavaScript and React code. We will also use Airbnb's Enzyme, a testing utility that makes it easy to render React components and assert on their output. To do this, let's add the following packages as development dependencies.

> npm install --save-dev enzyme enzyme-adapter-react-16 enzyme-to-json @types/enzyme @types/enzyme-adapter-react-16 @types/enzyme-to-json

Enzyme allows us to perform snapshot testing, where we compare the HTML produced by our test to an expected output. The enzyme-to-json package reduces the amount of code we have to write to achieve snapshot testing. Next we will integrate Jest with enzyme-to-json by adding the following to package.json.

"jest": {
    "snapshotSerializers": [
      "enzyme-to-json/serializer"
    ]
  }
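Conceptually, snapshot testing is simple: serialize the rendered output, store it on the first run, and compare against the stored copy on every later run. A stripped-down sketch of the idea (hypothetical, not Jest's actual implementation):

```typescript
// Hypothetical sketch of snapshot matching: record the first
// serialization of a test's output, then compare later runs against it.
const snapshots = new Map<string, string>();

function matchSnapshot(testName: string, rendered: string): boolean {
  const stored = snapshots.get(testName);
  if (stored === undefined) {
    snapshots.set(testName, rendered); // first run: record the snapshot
    return true;
  }
  return stored === rendered; // later runs: must match exactly
}

matchSnapshot("Paragraph", "<p>Hello</p>"); // first run records the output
matchSnapshot("Paragraph", "<p>Hello</p>"); // unchanged output passes
matchSnapshot("Paragraph", "<p>Bye</p>");   // changed output is flagged
```

When the rendered output changes intentionally, Jest lets you regenerate the stored snapshots with the --updateSnapshot (-u) flag.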

Enzyme needs to be configured before it can be used within the React application. To do that, we will add the following code to src/setupTests.ts (the file CRA automatically loads for test setup).

import { configure } from 'enzyme';
import Adapter from 'enzyme-adapter-react-16';
configure({ adapter: new Adapter() });

Finally we are ready to write some tests! In the src folder, let's add a React component called paragraph.tsx. It is a very simple component that renders the passed-in text inside a paragraph.

import React from 'react';
interface ParaProps {
    passedInText: string;
}
export default class Paragraph extends React.Component<ParaProps> {
    render() {
        return (
            <div className="container">
                <p>{this.props.passedInText}</p>
            </div>
        )
    }
}

Now, let's write some tests to test the code that we have just written. Create a file called paragraph.test.tsx and add the following test.

import React from "react";
import { shallow } from "enzyme";
import Paragraph from "./paragraph";
describe("Paragraph", () => {
    it('should render passed in string in a paragraph', () => {
        const paragraph = shallow(<Paragraph passedInText="Azure DevOps and SonarQube Rocks!" />);
        expect(paragraph).toMatchSnapshot();
    });
});
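The first test run writes a snapshot file next to the test (by default under src/__snapshots__). With the enzyme-to-json serializer configured, it contains a readable JSX-like rendering of the component, roughly like this (exact formatting may differ):

```
exports[`Paragraph should render passed in string in a paragraph 1`] = `
<div
  className="container"
>
  <p>
    Azure DevOps and SonarQube Rocks!
  </p>
</div>
`;
```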

Let's run the following command. The tests should pass, but the output does not tell us the code coverage.

> npm run test

Jest comes with a built-in code coverage analyzer. In the scripts section of package.json, we just have to add the --coverage switch for Jest to produce code coverage. However, remember that our aim is to integrate this with SonarQube. To do this, we have to add the following two packages as development dependencies.

> npm install --save-dev jest-sonar-reporter sonarqube-scanner

jest-sonar-reporter converts the report produced by Jest into a format that SonarQube can understand, while sonarqube-scanner uploads the report to our SonarQube instance. We will configure jest-sonar-reporter by adding the following to package.json, which tells it to produce the report at reports/test-report.xml.

"jestSonar": {
    "reportPath": "reports",
    "reportFile": "test-report.xml",
    "indent": 4
  }

We will add a file called sonar-project.tsx at the root of the application (next to package.json) and add the following code inside it.

const sonarqubeScanner = require("sonarqube-scanner");
sonarqubeScanner(
    {
        serverUrl: "#{sonarQubeServerUrl}#",
        token: "#{sonarQubeToken}#",
        options: {
            "sonar.projectName":"#{projectName}#",
            "sonar.projectKey":"#{projectKey}#",
            "sonar.sources": "./src",
            "sonar.exclusions": "**/*.test.*,**/__snapshots__/**,src/*.ts",
            "sonar.tests": "./src",
            "sonar.test.inclusions": "./src/**/*.test.tsx",
            "sonar.typescript.lcov.reportPaths": "coverage/lcov.info",
            "sonar.testExecutionReportPaths": "reports/test-report.xml",
        },
    },
    () => process.exit()
);

This file will allow us to contact SonarQube and upload the Code Analysis and Coverage report. Note the use of the #{ and }# tokens; we will replace these with actual values when running inside an Azure DevOps pipeline.
  • serverUrl is the URL to your SonarQube instance
  • token is the security token assigned to your Sonar user
  • sonar.projectName is the name of the project that you have configured in SonarQube.
  • sonar.projectKey is the project key for the SonarQube project
  • Optionally, if you are using the Developer Edition or above of SonarQube, you can also set a sonar.branchName property, which will be the branch you are currently analysing.
  • sonar.sources is the base directory for all of your code. This is where our React application lives.
  • sonar.exclusions is everything you do not want Sonar to analyze. The most important one for me is that we don't want analysis run on our test files.
  • sonar.tests is the location of all of your tests.
  • sonar.test.inclusions is a comma separated list of all files that should be treated as test files.
  • sonar.typescript.lcov.reportPaths is the path to the test coverage output file from Jest. By default this will be coverage/lcov.info.
  • sonar.testExecutionReportPaths is the path to the jest-sonar-reporter output file that we configured in previous steps.
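The substitution that the Replace Tokens task performs on this file can be sketched in a few lines of TypeScript (a hypothetical helper, not the task's actual code):

```typescript
// Sketch of #{name}# token substitution, mirroring what the Replace
// Tokens task does with pipeline variables. Unknown tokens are kept
// untouched, matching the task's default behavior for missing values.
function replaceTokens(content: string, vars: Record<string, string>): string {
  return content.replace(/#\{(\w+)\}#/g, (match: string, name: string) =>
    name in vars ? vars[name] : match
  );
}

const line = 'serverUrl: "#{sonarQubeServerUrl}#"';
console.log(replaceTokens(line, { sonarQubeServerUrl: "http://localhost:9000/" }));
// serverUrl: "http://localhost:9000/"
```

This pattern is also what keeps secrets like the Sonar token out of source control: the real values exist only as pipeline variables.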
We will also modify the commands in the scripts section of package.json to look like this:

"test": "react-scripts test --coverage --watchAll=false --testResultsProcessor jest-sonar-reporter",
"sonar": "node sonar-project.tsx"
If we run 'npm run test' now, we should see output similar to this:



At this point, if we replace the tokens in above file and run 'npm run sonar', we will get the Code Analysis and Coverage reports in our SonarQube instance. Running the command will also generate a lot of extra files which we don't want in our git repo so we will add the following to our .gitignore file.

# testing
/.scannerwork
/coverage
/reports
But since we don't have a SonarQube instance yet, we will hold off running that command till we set one up in the next section.


The fastest way to set up SonarQube is to use the official Docker image. To allow Azure DevOps to reach SonarQube at a public URL, we will deploy it inside an Azure Container Instance.
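For example, the instance can be stood up with the Azure CLI roughly like this (a sketch; the resource group, names and region are placeholders, and a production setup would also need an external database and persistent storage):

```shell
# Create a resource group and a SonarQube container instance.
# Names, region and DNS label below are placeholders.
az group create --name sonar-rg --location eastus

az container create \
  --resource-group sonar-rg \
  --name sonarqube \
  --image sonarqube:lts \
  --ports 9000 \
  --dns-name-label sonar-azuredevops-demo \
  --cpu 2 --memory 4
```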



We will log in to our SonarQube instance at the provided URL with the default username/password (admin/admin), and should see an empty dashboard like this.



Next we will generate a token from SonarQube which will allow Azure DevOps to authenticate and upload the Code Analysis and Coverage reports to SonarQube.


Next, create a project in SonarQube and note down the name and key that you used while creating it. We will use them in the next section.
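If you prefer scripting these steps, both the token and the project can also be created through SonarQube's Web API (a sketch; the host, project name/key and token name are placeholders, and default admin credentials are assumed):

```shell
# Create the SonarQube project (name and key are placeholders).
curl -u admin:admin -X POST \
  "http://<your-sonarqube-host>:9000/api/projects/create" \
  -d "name=sonar-azuredevops-app&project=sonar-azuredevops-app"

# Generate a user token for Azure DevOps to authenticate with.
curl -u admin:admin -X POST \
  "http://<your-sonarqube-host>:9000/api/user_tokens/generate" \
  -d "name=azure-devops-token"
```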


We are now ready to create an Azure DevOps pipeline that brings it all together. We will use the YAML-based pipeline creation model. We will start by adding the variables defined in sonar-project.tsx to our pipeline, using the Azure DevOps Build Pipeline variables feature to store them.



We will use the Replace Tokens task to replace the tokens with the values defined above. We will also add the npm install, test, sonar and build tasks, and then publish the build output as a build artifact.

Now, running the npm run sonar command will publish our code coverage to SonarQube, but we also want our build to fail if we don't meet the quality gate defined for our project. To do that, we will add a PowerShell task that queries the SonarQube API and checks whether our branch (in the case of Developer Edition and above) meets the Quality Gate.

# SonarQube expects the token as the Basic auth username with an empty password
$token = [System.Text.Encoding]::UTF8.GetBytes("$(sonarQubeToken)" + ":")
$base64 = [System.Convert]::ToBase64String($token)
# Give SonarQube a few seconds to finish processing the submitted analysis
Start-Sleep -s 10
$basicAuth = [string]::Format("Basic {0}", $base64)
$headers = @{ Authorization = $basicAuth }
$result = Invoke-RestMethod -Method Get -Uri "$(sonarQubeServerUrl)api/project_branches/list?project=$(projectKey)" -Headers $headers
$result | ConvertTo-Json | Write-Host
$branch = $result.branches | Where-Object { $_.name -eq "$(branchName)" }
if ($branch.status.qualityGateStatus -eq 'OK') {
    Write-Host "Quality Gate Succeeded"
} else {
    throw "Quality gate failed"
}

Our final pipeline code would look like this:

# Node.js with React
# Build a Node.js project that uses React.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/javascript
trigger:
- master
pool:
  vmImage: 'ubuntu-latest'
steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'
- task: replacetokens@3
  inputs:
    targetFiles: 'sonar-project.tsx'
    encoding: 'auto'
    writeBOM: true
    actionOnMissing: 'warn'
    keepToken: false
    tokenPrefix: '#{'
    tokenSuffix: '}#'
    useLegacyPattern: false
    enableTelemetry: true
- script: |
    npm install
  displayName: 'npm install'
- task: Npm@1
  displayName: 'npm test'
  inputs:
    command: 'custom'
    workingDir: '$(Build.SourcesDirectory)'
    customCommand: 'run test'
- task: Npm@1
  displayName: 'Publish Quality Gate Result'
  inputs:
    command: 'custom'
    workingDir: '$(Build.SourcesDirectory)'
    customCommand: 'run sonar'
- task: PowerShell@2
  displayName: 'Break on Quality Gate Failure'
  inputs:
    targetType: 'inline'
    script: |
      $token = [System.Text.Encoding]::UTF8.GetBytes("$(sonarQubeToken)" + ":")
      $base64 = [System.Convert]::ToBase64String($token)
      Start-Sleep -s 10
      $basicAuth = [string]::Format("Basic {0}", $base64)
      $headers = @{ Authorization = $basicAuth }
      $result = Invoke-RestMethod -Method Get -Uri "$(sonarQubeServerUrl)api/project_branches/list?project=$(projectKey)" -Headers $headers
      $result | ConvertTo-Json | Write-Host
      $branch = $result.branches | Where-Object { $_.name -eq "$(branchName)" }
      if ($branch.status.qualityGateStatus -eq 'OK') {
        Write-Host "Quality Gate Succeeded"
      } else {
        throw "Quality gate failed"
      }
- script: npm run build
  workingDirectory: '$(Build.SourcesDirectory)'
  displayName: "npm build"
  env:
    CI: true
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.SourcesDirectory)/build'
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.SourcesDirectory)/build.zip'
    replaceExistingArchive: true
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.SourcesDirectory)/build.zip'
    artifact: 'build'
    publishLocation: 'pipeline'

If we run the pipeline, we will see the output being generated and Code Analysis and Code Coverage result in SonarQube. Hooray!!






In this post, we learnt how to integrate a React app with SonarQube using Azure DevOps pipelines. We started from scratch and built our way up. You can find the entire code used above here - https://dev.azure.com/mayank/_git/ReactSonarQubeDevOps

Hope this helps!
